Kiwi TCMS not sending emails - kiwi-tcms

I'm new to Kiwi TCMS and I'm still on a dev environment, trying to get this up and running.
I'm using Ubuntu 18.04 on my PC. The Docker containers (kiwi_web and kiwi_db) are up and working properly as far as I can see.
My problem is that I can't get Kiwi to send emails.
I've added a volume in the docker-compose.yml file like this:
- ./local-settings/:/venv/lib64/python3.6/site-packages/tcms/settings/
Then, i created ./local-settings/local_settings.py with this inside:
EMAIL_HOST = 'smtp.server.com'
EMAIL_PORT = '485'
EMAIL_HOST_USER = 'admin@server.com'
EMAIL_HOST_PASSWORD = 'password'
EMAIL_USE_TLS = True
But it didn't work.
Here is the traceback:
web_1 | [Thu Oct 03 19:25:05.240200 2019] [wsgi:error] [pid 9] Exception in thread Thread-1:
web_1 | [Thu Oct 03 19:25:05.240218 2019] [wsgi:error] [pid 9] Traceback (most recent call last):
web_1 | [Thu Oct 03 19:25:05.240221 2019] [wsgi:error] [pid 9] File "/opt/rh/rh-python36/root/usr/lib64/python3.6/threading.py", line 916, in _bootstrap_inner
web_1 | [Thu Oct 03 19:25:05.240223 2019] [wsgi:error] [pid 9] self.run()
web_1 | [Thu Oct 03 19:25:05.240225 2019] [wsgi:error] [pid 9] File "/opt/rh/rh-python36/root/usr/lib64/python3.6/threading.py", line 864, in run
web_1 | [Thu Oct 03 19:25:05.240228 2019] [wsgi:error] [pid 9] self._target(*self._args, **self._kwargs)
web_1 | [Thu Oct 03 19:25:05.240230 2019] [wsgi:error] [pid 9] File "/venv/lib64/python3.6/site-packages/django/core/mail/__init__.py", line 60, in send_mail
web_1 | [Thu Oct 03 19:25:05.240232 2019] [wsgi:error] [pid 9] return mail.send()
web_1 | [Thu Oct 03 19:25:05.240234 2019] [wsgi:error] [pid 9] File "/venv/lib64/python3.6/site-packages/django/core/mail/message.py", line 291, in send
web_1 | [Thu Oct 03 19:25:05.240236 2019] [wsgi:error] [pid 9] return self.get_connection(fail_silently).send_messages([self])
web_1 | [Thu Oct 03 19:25:05.240238 2019] [wsgi:error] [pid 9] File "/venv/lib64/python3.6/site-packages/django/core/mail/backends/smtp.py", line 103, in send_messages
web_1 | [Thu Oct 03 19:25:05.240240 2019] [wsgi:error] [pid 9] new_conn_created = self.open()
web_1 | [Thu Oct 03 19:25:05.240242 2019] [wsgi:error] [pid 9] File "/venv/lib64/python3.6/site-packages/django/core/mail/backends/smtp.py", line 63, in open
web_1 | [Thu Oct 03 19:25:05.240244 2019] [wsgi:error] [pid 9] self.connection = self.connection_class(self.host, self.port, **connection_params)
web_1 | [Thu Oct 03 19:25:05.240246 2019] [wsgi:error] [pid 9] File "/opt/rh/rh-python36/root/usr/lib64/python3.6/smtplib.py", line 251, in __init__
web_1 | [Thu Oct 03 19:25:05.240248 2019] [wsgi:error] [pid 9] (code, msg) = self.connect(host, port)
web_1 | [Thu Oct 03 19:25:05.240250 2019] [wsgi:error] [pid 9] File "/opt/rh/rh-python36/root/usr/lib64/python3.6/smtplib.py", line 336, in connect
web_1 | [Thu Oct 03 19:25:05.240252 2019] [wsgi:error] [pid 9] self.sock = self._get_socket(host, port, self.timeout)
web_1 | [Thu Oct 03 19:25:05.240254 2019] [wsgi:error] [pid 9] File "/opt/rh/rh-python36/root/usr/lib64/python3.6/smtplib.py", line 307, in _get_socket
web_1 | [Thu Oct 03 19:25:05.240256 2019] [wsgi:error] [pid 9] self.source_address)
web_1 | [Thu Oct 03 19:25:05.240258 2019] [wsgi:error] [pid 9] File "/opt/rh/rh-python36/root/usr/lib64/python3.6/socket.py", line 724, in create_connection
web_1 | [Thu Oct 03 19:25:05.240260 2019] [wsgi:error] [pid 9] raise err
web_1 | [Thu Oct 03 19:25:05.240261 2019] [wsgi:error] [pid 9] File "/opt/rh/rh-python36/root/usr/lib64/python3.6/socket.py", line 713, in create_connection
web_1 | [Thu Oct 03 19:25:05.240263 2019] [wsgi:error] [pid 9] sock.connect(sa)
web_1 | [Thu Oct 03 19:25:05.240265 2019] [wsgi:error] [pid 9] OSError: [Errno 99] Cannot assign requested address
web_1 | [Thu Oct 03 19:25:05.240271 2019] [wsgi:error] [pid 9]
Thanks!

Your docker-compose mount line is wrong: you are mounting a local directory, not a file, into the tcms/settings directory, effectively overriding all of the application's default settings. Read the documentation more carefully:
https://kiwitcms.readthedocs.io/en/latest/installing_docker.html#customization
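For instance, a volume entry that mounts only the single file might look like this (a sketch based on the paths in the question; the exact in-container path may differ between Kiwi TCMS versions, so check the linked docs):

```yaml
volumes:
    # Mount one settings file, not the whole settings directory,
    # so the application's default settings stay in place.
    - ./local-settings/local_settings.py:/venv/lib64/python3.6/site-packages/tcms/settings/local_settings.py
```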

Related

Intermittent permissions issues publishing to pub/sub

We are getting intermittent permissions issues publishing from a GCP Kubernetes container to a GCP pub/sub topic. In one process/pid, we'll see ~50 messages go through, a few seconds of "The request is missing a valid API key" errors, followed by ~50 more messages going through. The full error message looks like:
{
"error": {
"code": 403,
"message": "The request is missing a valid API key.",
"status": "PERMISSION_DENIED"
}
}
Our setup:
We set up a Workload Identity User for our Kubernetes pod.
We linked that Workload Identity User to a service account.
We gave that service account publishing rights on the pub/sub topic.
We are using PHP with the vendor package google/cloud-pubsub version 1.30.2
I'd say 99% of our messages go through. But a few times a day, we get outages where messages fail with the permissions error for a few seconds, and then publishing starts working again.
I saw another post suggesting perhaps GOOGLE_APPLICATION_CREDENTIALS could be getting in the way. We verified we are not setting GOOGLE_APPLICATION_CREDENTIALS as we are relying on the service account to give the docker containers publish access to the pub/sub topic.
An early hypothesis was a race condition with docker images coming online/offline. But seeing a few seconds of permissions issues within a long-running process makes us think it's likely something else. Does anyone have any ideas or things we should look at?
We appreciate the help!
UPDATE 2021-09-28:
Here are our logs from yesterday, showing when and how many permissions errors we saw versus when and how many publishes went through. The counts are events per hour, from one docker instance:
Permissions issues:
fgrep 'The request is missing a valid API key' /log/file/was/here.log | perl -ne '/(Sep 27 [0-9]{2}):.*/; print $1 . "\n"' | uniq -c
38 Sep 27 06
90 Sep 27 07
176 Sep 27 17
186 Sep 27 18
54 Sep 27 19
No permissions issues:
fgrep 'published index requeue' /log/file/was/here.log | perl -ne '/(Sep 27 [0-9]{2}):.*/; print $1 . "\n"' | uniq -c
571 Sep 27 00
900 Sep 27 01
1117 Sep 27 02
562 Sep 27 03
3396 Sep 27 04
1767 Sep 27 05
4568 Sep 27 06
3857 Sep 27 07
2160 Sep 27 08
74 Sep 27 09
218 Sep 27 10
55 Sep 27 11
1992 Sep 27 12
4376 Sep 27 13
482 Sep 27 14
839 Sep 27 15
3533 Sep 27 16
8903 Sep 27 17
10067 Sep 27 18
9098 Sep 27 19
1006 Sep 27 20
2932 Sep 27 21
919 Sep 27 22
1104 Sep 27 23
The sum of these comes to 544 failures and 64496 successes, which is a failure rate of about 0.84% when averaged over the whole day. Looking at an hour where we saw failures, such as 1800Z, we see 186 failures and 10067 successes, a failure rate of about 1.8% for that hour.
UPDATE 2021-09-28 part 2:
I just dove into one of the hours where we saw failures and grouped the counts down to the minute.
Permissions issues in the 1800Z hour:
fgrep 'The request is missing a valid API key' /log/file/was/here.log | perl -ne '/(Sep 27 18:[0-9]{2}):.*/; print $1 . "\n"' | uniq -c
4 Sep 27 18:01
2 Sep 27 18:02
2 Sep 27 18:03
2 Sep 27 18:04
2 Sep 27 18:05
16 Sep 27 18:10
6 Sep 27 18:12
6 Sep 27 18:16
2 Sep 27 18:20
4 Sep 27 18:21
2 Sep 27 18:30
2 Sep 27 18:32
2 Sep 27 18:33
2 Sep 27 18:34
14 Sep 27 18:35
8 Sep 27 18:36
8 Sep 27 18:37
2 Sep 27 18:38
2 Sep 27 18:40
8 Sep 27 18:41
8 Sep 27 18:42
8 Sep 27 18:43
10 Sep 27 18:44
10 Sep 27 18:45
38 Sep 27 18:46
4 Sep 27 18:47
8 Sep 27 18:51
4 Sep 27 18:58
No permissions issues in the 1800Z hour:
fgrep 'published index requeue' /log/file/was/here.log | perl -ne '/(Sep 27 18:[0-9]{2}):.*/; print $1 . "\n"' | uniq -c
249 Sep 27 18:00
187 Sep 27 18:01
156 Sep 27 18:02
266 Sep 27 18:03
302 Sep 27 18:04
149 Sep 27 18:05
29 Sep 27 18:06
572 Sep 27 18:07
499 Sep 27 18:08
395 Sep 27 18:09
189 Sep 27 18:10
257 Sep 27 18:11
194 Sep 27 18:12
185 Sep 27 18:13
94 Sep 27 18:14
49 Sep 27 18:15
71 Sep 27 18:16
77 Sep 27 18:17
36 Sep 27 18:18
34 Sep 27 18:19
73 Sep 27 18:20
114 Sep 27 18:21
82 Sep 27 18:22
87 Sep 27 18:23
20 Sep 27 18:24
39 Sep 27 18:25
56 Sep 27 18:26
31 Sep 27 18:27
79 Sep 27 18:28
82 Sep 27 18:29
270 Sep 27 18:30
32 Sep 27 18:31
174 Sep 27 18:32
165 Sep 27 18:33
91 Sep 27 18:34
213 Sep 27 18:35
243 Sep 27 18:36
208 Sep 27 18:37
92 Sep 27 18:38
76 Sep 27 18:39
211 Sep 27 18:40
269 Sep 27 18:41
164 Sep 27 18:42
322 Sep 27 18:43
228 Sep 27 18:44
294 Sep 27 18:45
242 Sep 27 18:46
318 Sep 27 18:47
145 Sep 27 18:48
42 Sep 27 18:49
115 Sep 27 18:50
206 Sep 27 18:51
214 Sep 27 18:52
213 Sep 27 18:53
169 Sep 27 18:54
169 Sep 27 18:55
208 Sep 27 18:56
190 Sep 27 18:57
88 Sep 27 18:58
43 Sep 27 18:59

PowerShell - A weird issue related to ConvertFrom-String

So I am trying to get the average lengths of the four seasons in the 21st century using PowerShell, as a self-imposed programming challenge.
My idea: read values from a text file, create a [PSCustomObject] per line and assign the values to its NoteProperties; prepend the year to each date, convert the dates to [datetime], and add them to a second array; then loop through the second array by index, get each season's length with New-TimeSpan, and add those to a third array; finally, run Measure-Object on the third array. Forgive me if this sounds confusing in English, but it really is very simple in code.
I got the dates of the equinoxes and solstices from here: Solstices and Equinoxes: 2001 to 2100
Using Notepad++ to format the dates I got this:
2001 Mar 20 13:31 Jun 21 07:38 Sep 22 23:05 Dec 21 19:22
2002 Mar 20 19:16 Jun 21 13:25 Sep 23 04:56 Dec 22 01:15
2003 Mar 21 01:00 Jun 21 19:11 Sep 23 10:47 Dec 22 07:04
2004 Mar 20 06:49 Jun 21 00:57 Sep 22 16:30 Dec 21 12:42
2005 Mar 20 12:34 Jun 21 06:46 Sep 22 22:23 Dec 21 18:35
2006 Mar 20 18:25 Jun 21 12:26 Sep 23 04:04 Dec 22 00:22
2007 Mar 21 00:07 Jun 21 18:06 Sep 23 09:51 Dec 22 06:08
2008 Mar 20 05:49 Jun 21 00:00 Sep 22 15:45 Dec 21 12:04
2009 Mar 20 11:44 Jun 21 05:45 Sep 22 21:18 Dec 21 17:47
2010 Mar 20 17:32 Jun 21 11:28 Sep 23 03:09 Dec 21 23:38
2011 Mar 20 23:21 Jun 21 17:16 Sep 23 09:05 Dec 22 05:30
2012 Mar 20 05:15 Jun 20 23:08 Sep 22 14:49 Dec 21 11:12
2013 Mar 20 11:02 Jun 21 05:04 Sep 22 20:44 Dec 21 17:11
2014 Mar 20 16:57 Jun 21 10:52 Sep 23 02:30 Dec 21 23:03
2015 Mar 20 22:45 Jun 21 16:38 Sep 23 08:20 Dec 22 04:48
2016 Mar 20 04:31 Jun 20 22:35 Sep 22 14:21 Dec 21 10:45
2017 Mar 20 10:29 Jun 21 04:25 Sep 22 20:02 Dec 21 16:29
2018 Mar 20 16:15 Jun 21 10:07 Sep 23 01:54 Dec 21 22:22
2019 Mar 20 21:58 Jun 21 15:54 Sep 23 07:50 Dec 22 04:19
2020 Mar 20 03:50 Jun 20 21:43 Sep 22 13:31 Dec 21 10:03
2021 Mar 20 09:37 Jun 21 03:32 Sep 22 19:21 Dec 21 15:59
2022 Mar 20 15:33 Jun 21 09:14 Sep 23 01:04 Dec 21 21:48
2023 Mar 20 21:25 Jun 21 14:58 Sep 23 06:50 Dec 22 03:28
2024 Mar 20 03:07 Jun 20 20:51 Sep 22 12:44 Dec 21 09:20
2025 Mar 20 09:02 Jun 21 02:42 Sep 22 18:20 Dec 21 15:03
2026 Mar 20 14:46 Jun 21 08:25 Sep 23 00:06 Dec 21 20:50
2027 Mar 20 20:25 Jun 21 14:11 Sep 23 06:02 Dec 22 02:43
2028 Mar 20 02:17 Jun 20 20:02 Sep 22 11:45 Dec 21 08:20
2029 Mar 20 08:01 Jun 21 01:48 Sep 22 17:37 Dec 21 14:14
2030 Mar 20 13:51 Jun 21 07:31 Sep 22 23:27 Dec 21 20:09
2031 Mar 20 19:41 Jun 21 13:17 Sep 23 05:15 Dec 22 01:56
2032 Mar 20 01:23 Jun 20 19:09 Sep 22 11:11 Dec 21 07:57
2033 Mar 20 07:23 Jun 21 01:01 Sep 22 16:52 Dec 21 13:45
2034 Mar 20 13:18 Jun 21 06:45 Sep 22 22:41 Dec 21 19:35
2035 Mar 20 19:03 Jun 21 12:33 Sep 23 04:39 Dec 22 01:31
2036 Mar 20 01:02 Jun 20 18:31 Sep 22 10:23 Dec 21 07:12
2037 Mar 20 06:50 Jun 21 00:22 Sep 22 16:13 Dec 21 13:08
2038 Mar 20 12:40 Jun 21 06:09 Sep 22 22:02 Dec 21 19:01
2039 Mar 20 18:32 Jun 21 11:58 Sep 23 03:50 Dec 22 00:41
2040 Mar 20 00:11 Jun 20 17:46 Sep 22 09:44 Dec 21 06:33
2041 Mar 20 06:07 Jun 20 23:37 Sep 22 15:27 Dec 21 12:19
2042 Mar 20 11:53 Jun 21 05:16 Sep 22 21:11 Dec 21 18:04
2043 Mar 20 17:29 Jun 21 10:59 Sep 23 03:07 Dec 22 00:02
2044 Mar 19 23:20 Jun 20 16:50 Sep 22 08:47 Dec 21 05:43
2045 Mar 20 05:08 Jun 20 22:34 Sep 22 14:33 Dec 21 11:36
2046 Mar 20 10:58 Jun 21 04:15 Sep 22 20:22 Dec 21 17:28
2047 Mar 20 16:52 Jun 21 10:02 Sep 23 02:07 Dec 21 23:07
2048 Mar 19 22:34 Jun 20 15:54 Sep 22 08:01 Dec 21 05:02
2049 Mar 20 04:28 Jun 20 21:47 Sep 22 13:42 Dec 21 10:51
2050 Mar 20 10:20 Jun 21 03:33 Sep 22 19:29 Dec 21 16:39
2051 Mar 20 15:58 Jun 21 09:17 Sep 23 01:26 Dec 21 22:33
2052 Mar 19 21:56 Jun 20 15:16 Sep 22 07:16 Dec 21 04:18
2053 Mar 20 03:46 Jun 20 21:03 Sep 22 13:05 Dec 21 10:09
2054 Mar 20 09:35 Jun 21 02:47 Sep 22 19:00 Dec 21 16:10
2055 Mar 20 15:28 Jun 21 08:39 Sep 23 00:48 Dec 21 21:56
2056 Mar 19 21:11 Jun 20 14:29 Sep 22 06:40 Dec 21 03:52
2057 Mar 20 03:08 Jun 20 20:19 Sep 22 12:23 Dec 21 09:42
2058 Mar 20 09:04 Jun 21 02:03 Sep 22 18:07 Dec 21 15:24
2059 Mar 20 14:44 Jun 21 07:47 Sep 23 00:03 Dec 21 21:18
2060 Mar 19 20:37 Jun 20 13:44 Sep 22 05:47 Dec 21 03:00
2061 Mar 20 02:26 Jun 20 19:33 Sep 22 11:31 Dec 21 08:49
2062 Mar 20 08:07 Jun 21 01:10 Sep 22 17:19 Dec 21 14:42
2063 Mar 20 13:59 Jun 21 07:02 Sep 22 23:08 Dec 21 20:22
2064 Mar 19 19:40 Jun 20 12:47 Sep 22 04:58 Dec 21 02:10
2065 Mar 20 01:27 Jun 20 18:31 Sep 22 10:41 Dec 21 07:59
2066 Mar 20 07:19 Jun 21 00:16 Sep 22 16:27 Dec 21 13:45
2067 Mar 20 12:55 Jun 21 05:56 Sep 22 22:20 Dec 21 19:44
2068 Mar 19 18:51 Jun 20 11:55 Sep 22 04:09 Dec 21 01:34
2069 Mar 20 00:44 Jun 20 17:40 Sep 22 09:51 Dec 21 07:21
2070 Mar 20 06:35 Jun 20 23:22 Sep 22 15:45 Dec 21 13:19
2071 Mar 20 12:36 Jun 21 05:21 Sep 22 21:39 Dec 21 19:05
2072 Mar 19 18:19 Jun 20 11:12 Sep 22 03:26 Dec 21 00:54
2073 Mar 20 00:12 Jun 20 17:06 Sep 22 09:14 Dec 21 06:50
2074 Mar 20 06:09 Jun 20 22:59 Sep 22 15:04 Dec 21 12:36
2075 Mar 20 11:48 Jun 21 04:41 Sep 22 21:00 Dec 21 18:28
2076 Mar 19 17:37 Jun 20 10:35 Sep 22 02:48 Dec 21 00:12
2077 Mar 19 23:30 Jun 20 16:23 Sep 22 08:35 Dec 21 06:00
2078 Mar 20 05:11 Jun 20 21:58 Sep 22 14:25 Dec 21 11:59
2079 Mar 20 11:03 Jun 21 03:51 Sep 22 20:15 Dec 21 17:46
2080 Mar 19 16:43 Jun 20 09:33 Sep 22 01:55 Dec 20 23:31
2081 Mar 19 22:34 Jun 20 15:16 Sep 22 07:38 Dec 21 05:22
2082 Mar 20 04:32 Jun 20 21:04 Sep 22 13:24 Dec 21 11:06
2083 Mar 20 10:08 Jun 21 02:41 Sep 22 19:10 Dec 21 16:51
2084 Mar 19 15:58 Jun 20 08:39 Sep 22 00:58 Dec 20 22:40
2085 Mar 19 21:53 Jun 20 14:33 Sep 22 06:43 Dec 21 04:29
2086 Mar 20 03:36 Jun 20 20:11 Sep 22 12:33 Dec 21 10:24
2087 Mar 20 09:27 Jun 21 02:05 Sep 22 18:27 Dec 21 16:07
2088 Mar 19 15:16 Jun 20 07:57 Sep 22 00:18 Dec 20 21:56
2089 Mar 19 21:07 Jun 20 13:43 Sep 22 06:07 Dec 21 03:53
2090 Mar 20 03:03 Jun 20 19:37 Sep 22 12:01 Dec 21 09:45
2091 Mar 20 08:40 Jun 21 01:17 Sep 22 17:49 Dec 21 15:37
2092 Mar 19 14:33 Jun 20 07:14 Sep 21 23:41 Dec 20 21:31
2093 Mar 19 20:35 Jun 20 13:08 Sep 22 05:30 Dec 21 03:21
2094 Mar 20 02:20 Jun 20 18:40 Sep 22 11:15 Dec 21 09:11
2095 Mar 20 08:14 Jun 21 00:38 Sep 22 17:10 Dec 21 15:00
2096 Mar 19 14:03 Jun 20 06:31 Sep 21 22:55 Dec 20 20:46
2097 Mar 19 19:49 Jun 20 12:14 Sep 22 04:37 Dec 21 02:38
2098 Mar 20 01:38 Jun 20 18:01 Sep 22 10:22 Dec 21 08:19
2099 Mar 20 07:17 Jun 20 23:41 Sep 22 16:10 Dec 21 14:04
2100 Mar 20 13:04 Jun 21 05:32 Sep 22 22:00 Dec 21 19:51
Then I ran this code in PowerShell 7.1 x64 on Windows 10:
$timetable = Get-Content .\Desktop\Equinox-Solstice.txt | ConvertFrom-String -TemplateFile .\Desktop\template.txt
$count=$timetable.count
$timetable1=@()
for ($i=0;$i -lt $count;$i++) {
$year=[string]$timetable[$i].year
$mequi=[datetime]($year+" "+$timetable[$i].marequi)
$jsols=[datetime]($year+" "+$timetable[$i].junsols)
$sequi=[datetime]($year+" "+$timetable[$i].sepequi)
$dsols=[datetime]($year+" "+$timetable[$i].decsols)
$timetable1+=[pscustomobject]@{year=$year;mequi=$mequi;jsols=$jsols;sequi=$sequi;dsols=$dsols}
}
With this as template:
{[int]year*:2001} {[string]marequi:Mar 20 13:31} {[string]junsols:Jun 21 07:38} {[string]sepequi:Sep 22 23:05} {[string]decsols:Dec 21 19:22}
{[int]year*:2002} {[string]marequi:Mar 20 19:16} {[string]junsols:Jun 21 13:25} {[string]sepequi:Sep 23 04:56} {[string]decsols:Dec 22 01:15}
{[int]year*:2003} {[string]marequi:Mar 21 01:00} {[string]junsols:Jun 21 19:11} {[string]sepequi:Sep 23 10:47} {[string]decsols:Dec 22 07:04}
{[int]year*:2004} {[string]marequi:Mar 20 06:49} {[string]junsols:Jun 21 00:57} {[string]sepequi:Sep 22 16:30} {[string]decsols:Dec 21 12:42}
{[int]year*:2005} {[string]marequi:Mar 20 12:34} {[string]junsols:Jun 21 06:46} {[string]sepequi:Sep 22 22:23} {[string]decsols:Dec 21 18:35}
{[int]year*:2006} {[string]marequi:Mar 20 18:25} {[string]junsols:Jun 21 12:26} {[string]sepequi:Sep 23 04:04} {[string]decsols:Dec 22 00:22}
{[int]year*:2007} {[string]marequi:Mar 21 00:07} {[string]junsols:Jun 21 18:06} {[string]sepequi:Sep 23 09:51} {[string]decsols:Dec 22 06:08}
{[int]year*:2008} {[string]marequi:Mar 20 05:49} {[string]junsols:Jun 21 00:00} {[string]sepequi:Sep 22 15:45} {[string]decsols:Dec 21 12:04}
{[int]year*:2009} {[string]marequi:Mar 20 11:44} {[string]junsols:Jun 21 05:45} {[string]sepequi:Sep 22 21:18} {[string]decsols:Dec 21 17:47}
{[int]year*:2010} {[string]marequi:Mar 20 17:32} {[string]junsols:Jun 21 11:28} {[string]sepequi:Sep 23 03:09} {[string]decsols:Dec 21 23:38}
And when I ran the command, I got lots of error messages, all of them like this:
InvalidArgument:
Line |
3 | $mequi=[datetime]($year+" "+$timetable[$i].marequi)
| ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
| Cannot convert value "2093 " to type "System.DateTime". Error: "String '2093 ' was not recognized as a valid DateTime."
I have checked thoroughly and found all of them come from $mequi, which stands for March equinox. Realising the dates must be missing, I typed $timetable and confirmed that lots of March equinox dates are missing, and only March equinox dates. I then used this command:
for($i=0;$i -lt 99;$i++){if ($timetable[$i].marequi -eq $null){$i}}
And found out 46 of them are missing, while the dates in the other 3 categories are all present; it is just the March equinox. The indexes of the missing dates are the following:
11, 15, 19, 23, 27, 31, 35, 39, 40, 43, 44, 47, 48, 51, 52, 55, 56, 59, 60, 63, 64, 67, 68, 69, 71, 72, 73, 75, 76, 77, 79, 80, 81, 83, 84, 85, 87, 88, 89, 91, 92, 93, 95, 96, 97, 98
Any ideas? Can someone help me, please?
Update 3: Here is the complete code I used:
$timetable = Get-Content .\Desktop\Equinox-Solstice.txt | Convertfrom-String -Templatefile .\Desktop\template.txt
$count=$timetable.count
[array]$timetable1 = 0..99 | foreach-object {
$year=[string]$timetable[$_].year
$mequi=[datetime]($year+" "+$timetable[$_].marequi)
$jsols=[datetime]($year+" "+$timetable[$_].junsols)
$sequi=[datetime]($year+" "+$timetable[$_].sepequi)
$dsols=[datetime]($year+" "+$timetable[$_].decsols)
[pscustomobject]@{year=$year;mequi=$mequi;jsols=$jsols;sequi=$sequi;dsols=$dsols}
}
[array]$seasons=0..98 | foreach-object {
$year=$timetable1[$_].year
$spring=[double](New-Timespan -Start $timetable1[$_].mequi -End $timetable1[$_].jsols).totaldays
$summer=[double](New-Timespan -Start $timetable1[$_].jsols -End $timetable1[$_].sequi).totaldays
$autumn=[double](New-Timespan -Start $timetable1[$_].sequi -End $timetable1[$_].dsols).totaldays
$winter=[double](New-Timespan -Start $timetable1[$_].dsols -End $timetable1[$_+1].mequi).totaldays
[pscustomobject]@{year=$year;spring=$spring;summer=$summer;autumn=$autumn;winter=$winter}
}
$meanspring=($seasons.spring | Measure-Object -Average).average
$meansummer=($seasons.summer | Measure-Object -Average).average
$meanautumn=($seasons.autumn | Measure-Object -Average).average
$meanwinter=($seasons.winter | Measure-Object -Average).average
$meanyear=$meanspring+$meansummer+$meanautumn+$meanwinter
Write-Host "Mean Spring Length: $meanspring days, Mean Summer Length: $meansummer days, Mean Autumn Length: $meanautumn days, Mean Winter Length: $meanwinter days, Mean Solar Year Length: $meanyear days"
And this is what I got:
Mean Spring Length: 92.7203563411897 days, Mean Summer Length: 93.6684764309764 days, Mean Autumn Length: 89.8785914702581 days, Mean Winter Length: 88.9748106060606 days, Mean Solar Year Length: 365.242234848485 days.
Update: I am now editing the question because I want to share a better method.
With a little more effort at find and replace in Notepad++ I was able to obtain this:
"year","marequi","junsols","sepequi","decsols"
"2001","Mar 20 13:31","Jun 21 07:38","Sep 22 23:05","Dec 21 19:22"
"2002","Mar 20 19:16","Jun 21 13:25","Sep 23 04:56","Dec 22 01:15"
"2003","Mar 21 01:00","Jun 21 19:11","Sep 23 10:47","Dec 22 07:04"
"2004","Mar 20 06:49","Jun 21 00:57","Sep 22 16:30","Dec 21 12:42"
"2005","Mar 20 12:34","Jun 21 06:46","Sep 22 22:23","Dec 21 18:35"
"2006","Mar 20 18:25","Jun 21 12:26","Sep 23 04:04","Dec 22 00:22"
"2007","Mar 21 00:07","Jun 21 18:06","Sep 23 09:51","Dec 22 06:08"
"2008","Mar 20 05:49","Jun 21 00:00","Sep 22 15:45","Dec 21 12:04"
"2009","Mar 20 11:44","Jun 21 05:45","Sep 22 21:18","Dec 21 17:47"
"2010","Mar 20 17:32","Jun 21 11:28","Sep 23 03:09","Dec 21 23:38"
"2011","Mar 20 23:21","Jun 21 17:16","Sep 23 09:05","Dec 22 05:30"
"2012","Mar 20 05:15","Jun 20 23:08","Sep 22 14:49","Dec 21 11:12"
"2013","Mar 20 11:02","Jun 21 05:04","Sep 22 20:44","Dec 21 17:11"
"2014","Mar 20 16:57","Jun 21 10:52","Sep 23 02:30","Dec 21 23:03"
"2015","Mar 20 22:45","Jun 21 16:38","Sep 23 08:20","Dec 22 04:48"
"2016","Mar 20 04:31","Jun 20 22:35","Sep 22 14:21","Dec 21 10:45"
"2017","Mar 20 10:29","Jun 21 04:25","Sep 22 20:02","Dec 21 16:29"
"2018","Mar 20 16:15","Jun 21 10:07","Sep 23 01:54","Dec 21 22:22"
"2019","Mar 20 21:58","Jun 21 15:54","Sep 23 07:50","Dec 22 04:19"
"2020","Mar 20 03:50","Jun 20 21:43","Sep 22 13:31","Dec 21 10:03"
"2021","Mar 20 09:37","Jun 21 03:32","Sep 22 19:21","Dec 21 15:59"
"2022","Mar 20 15:33","Jun 21 09:14","Sep 23 01:04","Dec 21 21:48"
"2023","Mar 20 21:25","Jun 21 14:58","Sep 23 06:50","Dec 22 03:28"
"2024","Mar 20 03:07","Jun 20 20:51","Sep 22 12:44","Dec 21 09:20"
"2025","Mar 20 09:02","Jun 21 02:42","Sep 22 18:20","Dec 21 15:03"
"2026","Mar 20 14:46","Jun 21 08:25","Sep 23 00:06","Dec 21 20:50"
"2027","Mar 20 20:25","Jun 21 14:11","Sep 23 06:02","Dec 22 02:43"
"2028","Mar 20 02:17","Jun 20 20:02","Sep 22 11:45","Dec 21 08:20"
"2029","Mar 20 08:01","Jun 21 01:48","Sep 22 17:37","Dec 21 14:14"
"2030","Mar 20 13:51","Jun 21 07:31","Sep 22 23:27","Dec 21 20:09"
"2031","Mar 20 19:41","Jun 21 13:17","Sep 23 05:15","Dec 22 01:56"
"2032","Mar 20 01:23","Jun 20 19:09","Sep 22 11:11","Dec 21 07:57"
"2033","Mar 20 07:23","Jun 21 01:01","Sep 22 16:52","Dec 21 13:45"
"2034","Mar 20 13:18","Jun 21 06:45","Sep 22 22:41","Dec 21 19:35"
"2035","Mar 20 19:03","Jun 21 12:33","Sep 23 04:39","Dec 22 01:31"
"2036","Mar 20 01:02","Jun 20 18:31","Sep 22 10:23","Dec 21 07:12"
"2037","Mar 20 06:50","Jun 21 00:22","Sep 22 16:13","Dec 21 13:08"
"2038","Mar 20 12:40","Jun 21 06:09","Sep 22 22:02","Dec 21 19:01"
"2039","Mar 20 18:32","Jun 21 11:58","Sep 23 03:50","Dec 22 00:41"
"2040","Mar 20 00:11","Jun 20 17:46","Sep 22 09:44","Dec 21 06:33"
"2041","Mar 20 06:07","Jun 20 23:37","Sep 22 15:27","Dec 21 12:19"
"2042","Mar 20 11:53","Jun 21 05:16","Sep 22 21:11","Dec 21 18:04"
"2043","Mar 20 17:29","Jun 21 10:59","Sep 23 03:07","Dec 22 00:02"
"2044","Mar 19 23:20","Jun 20 16:50","Sep 22 08:47","Dec 21 05:43"
"2045","Mar 20 05:08","Jun 20 22:34","Sep 22 14:33","Dec 21 11:36"
"2046","Mar 20 10:58","Jun 21 04:15","Sep 22 20:22","Dec 21 17:28"
"2047","Mar 20 16:52","Jun 21 10:02","Sep 23 02:07","Dec 21 23:07"
"2048","Mar 19 22:34","Jun 20 15:54","Sep 22 08:01","Dec 21 05:02"
"2049","Mar 20 04:28","Jun 20 21:47","Sep 22 13:42","Dec 21 10:51"
"2050","Mar 20 10:20","Jun 21 03:33","Sep 22 19:29","Dec 21 16:39"
"2051","Mar 20 15:58","Jun 21 09:17","Sep 23 01:26","Dec 21 22:33"
"2052","Mar 19 21:56","Jun 20 15:16","Sep 22 07:16","Dec 21 04:18"
"2053","Mar 20 03:46","Jun 20 21:03","Sep 22 13:05","Dec 21 10:09"
"2054","Mar 20 09:35","Jun 21 02:47","Sep 22 19:00","Dec 21 16:10"
"2055","Mar 20 15:28","Jun 21 08:39","Sep 23 00:48","Dec 21 21:56"
"2056","Mar 19 21:11","Jun 20 14:29","Sep 22 06:40","Dec 21 03:52"
"2057","Mar 20 03:08","Jun 20 20:19","Sep 22 12:23","Dec 21 09:42"
"2058","Mar 20 09:04","Jun 21 02:03","Sep 22 18:07","Dec 21 15:24"
"2059","Mar 20 14:44","Jun 21 07:47","Sep 23 00:03","Dec 21 21:18"
"2060","Mar 19 20:37","Jun 20 13:44","Sep 22 05:47","Dec 21 03:00"
"2061","Mar 20 02:26","Jun 20 19:33","Sep 22 11:31","Dec 21 08:49"
"2062","Mar 20 08:07","Jun 21 01:10","Sep 22 17:19","Dec 21 14:42"
"2063","Mar 20 13:59","Jun 21 07:02","Sep 22 23:08","Dec 21 20:22"
"2064","Mar 19 19:40","Jun 20 12:47","Sep 22 04:58","Dec 21 02:10"
"2065","Mar 20 01:27","Jun 20 18:31","Sep 22 10:41","Dec 21 07:59"
"2066","Mar 20 07:19","Jun 21 00:16","Sep 22 16:27","Dec 21 13:45"
"2067","Mar 20 12:55","Jun 21 05:56","Sep 22 22:20","Dec 21 19:44"
"2068","Mar 19 18:51","Jun 20 11:55","Sep 22 04:09","Dec 21 01:34"
"2069","Mar 20 00:44","Jun 20 17:40","Sep 22 09:51","Dec 21 07:21"
"2070","Mar 20 06:35","Jun 20 23:22","Sep 22 15:45","Dec 21 13:19"
"2071","Mar 20 12:36","Jun 21 05:21","Sep 22 21:39","Dec 21 19:05"
"2072","Mar 19 18:19","Jun 20 11:12","Sep 22 03:26","Dec 21 00:54"
"2073","Mar 20 00:12","Jun 20 17:06","Sep 22 09:14","Dec 21 06:50"
"2074","Mar 20 06:09","Jun 20 22:59","Sep 22 15:04","Dec 21 12:36"
"2075","Mar 20 11:48","Jun 21 04:41","Sep 22 21:00","Dec 21 18:28"
"2076","Mar 19 17:37","Jun 20 10:35","Sep 22 02:48","Dec 21 00:12"
"2077","Mar 19 23:30","Jun 20 16:23","Sep 22 08:35","Dec 21 06:00"
"2078","Mar 20 05:11","Jun 20 21:58","Sep 22 14:25","Dec 21 11:59"
"2079","Mar 20 11:03","Jun 21 03:51","Sep 22 20:15","Dec 21 17:46"
"2080","Mar 19 16:43","Jun 20 09:33","Sep 22 01:55","Dec 20 23:31"
"2081","Mar 19 22:34","Jun 20 15:16","Sep 22 07:38","Dec 21 05:22"
"2082","Mar 20 04:32","Jun 20 21:04","Sep 22 13:24","Dec 21 11:06"
"2083","Mar 20 10:08","Jun 21 02:41","Sep 22 19:10","Dec 21 16:51"
"2084","Mar 19 15:58","Jun 20 08:39","Sep 22 00:58","Dec 20 22:40"
"2085","Mar 19 21:53","Jun 20 14:33","Sep 22 06:43","Dec 21 04:29"
"2086","Mar 20 03:36","Jun 20 20:11","Sep 22 12:33","Dec 21 10:24"
"2087","Mar 20 09:27","Jun 21 02:05","Sep 22 18:27","Dec 21 16:07"
"2088","Mar 19 15:16","Jun 20 07:57","Sep 22 00:18","Dec 20 21:56"
"2089","Mar 19 21:07","Jun 20 13:43","Sep 22 06:07","Dec 21 03:53"
"2090","Mar 20 03:03","Jun 20 19:37","Sep 22 12:01","Dec 21 09:45"
"2091","Mar 20 08:40","Jun 21 01:17","Sep 22 17:49","Dec 21 15:37"
"2092","Mar 19 14:33","Jun 20 07:14","Sep 21 23:41","Dec 20 21:31"
"2093","Mar 19 20:35","Jun 20 13:08","Sep 22 05:30","Dec 21 03:21"
"2094","Mar 20 02:20","Jun 20 18:40","Sep 22 11:15","Dec 21 09:11"
"2095","Mar 20 08:14","Jun 21 00:38","Sep 22 17:10","Dec 21 15:00"
"2096","Mar 19 14:03","Jun 20 06:31","Sep 21 22:55","Dec 20 20:46"
"2097","Mar 19 19:49","Jun 20 12:14","Sep 22 04:37","Dec 21 02:38"
"2098","Mar 20 01:38","Jun 20 18:01","Sep 22 10:22","Dec 21 08:19"
"2099","Mar 20 07:17","Jun 20 23:41","Sep 22 16:10","Dec 21 14:04"
"2100","Mar 20 13:04","Jun 21 05:32","Sep 22 22:00","Dec 21 19:51"
Save this as seasons.csv.
Now use this code:
$Seasons=Import-Csv path\to\seasons.csv | % {
$MEqui=[DateTime]([string]$_.Year + ' ' + $_.marequi)
$JSols=[DateTime]([string]$_.Year + ' ' + $_.junsols)
$SEqui=[DateTime]([string]$_.Year + ' ' + $_.sepequi)
$DSols=[DateTime]([string]$_.Year + ' ' + $_.decsols)
[pscustomobject]@{mequi=$MEqui;jsols=$JSols;sequi=$SEqui;dsols=$DSols}
}
$Seasons | %{
$MEqui=$_.mequi.ToString('yyyy-MM-ddTHH:mm:ssZ')
$JSols=$_.jsols.ToString('yyyy-MM-ddTHH:mm:ssZ')
$SEqui=$_.sequi.ToString('yyyy-MM-ddTHH:mm:ssZ')
$DSols=$_.dsols.ToString('yyyy-MM-ddTHH:mm:ssZ')
[pscustomobject]@{MarchEquinox=$MEqui;JuneSolstice=$JSols;SeptemberEquinox=$SEqui;DecemberSolstice=$DSols} | Export-Csv path\to\output.csv -NoTypeInformation -Append
}
And the rest of the steps are the same (just replace $timetable1 with $seasons). This approach is better because importing from and exporting to CSV is much easier and far more reliable.
For the template issue, it appears that you have only one sample day value (Jun 21) for the junsols column; if you update that, you will see that it works just fine.
In general, I would recommend the following template to cover all the dates:
{[int]year*:0000} {[string]marequi:Mar 00 00:00} {[string]junsols:Jun 00 00:00} {[string]sepequi:Sep 00 00:00} {[string]decsols:Dec 00 00:00}
{[int]year*:9999} {[string]marequi:Mar 99 99:99} {[string]junsols:Jun 99 99:99} {[string]sepequi:Sep 99 99:99} {[string]decsols:Dec 99 99:99}
Also, as a side note: try to avoid using the increase assignment operator (+=) to create a collection, as it is exponentially expensive.

How do I transform a date array into another format?

I have a const reserved: Date[] which looks like this:
0: Thu Oct 01 2020 02:00:00 GMT+0200 (Ora legale dell’Europa centrale) {}
1: Sat Oct 10 2020 02:00:00 GMT+0200 (Ora legale dell’Europa centrale) {}
2: Fri Oct 30 2020 01:00:00 GMT+0100 (Ora standard dell’Europa centrale) {}
3: Sat Oct 31 2020 01:00:00 GMT+0100 (Ora standard dell’Europa centrale) {}
4: Fri Oct 02 2020 02:00:00 GMT+0200 (Ora legale dell’Europa centrale) {}
but I would like this array to present itself to me like this:
0: 2020/10/01
1: 2020/10/10
2: 2020/10/30
3: 2020/10/31
4: 2020/10/02
I tried to transform the dates this way:
for (const element of reservedArray) {
this.shareDate.newdate = this.shareDate.pipe.transform(element, 'yyyy/MM/dd');
this.shareDate.arraydateRes.push(new Date(this.shareDate.newdate));
console.log(this.shareDate.arraydateRes);
}
the result of the console.log is this:
(10) [Thu Oct 01 2020 00:00:00 GMT+0200 (Ora legale dell’Europa centrale), Sat Oct 10 2020 00:00:00
GMT+0200 (Ora legale dell’Europa centrale), Fri Oct 30 2020 00:00:00 GMT+0100 (Ora standard dell’Europa
centrale), Sat Oct 31 2020 00:00:00 GMT+0100 (Ora standard dell’Europa centrale), Fri Oct 02 2020
00:00:00 GMT+0200 (Ora legale dell’Europa centrale), Sat Oct 03 2020 00:00:00 GMT+0200 (Ora legale
dell’Europa centrale), Sun Oct 04 2020 00:00:00 GMT+0200 (Ora legale dell’Europa centrale), Mon Oct 05
2020 00:00:00 GMT+0200 (Ora legale dell’Europa centrale), Tue Oct 06 2020 00:00:00 GMT+0200 (Ora legale
dell’Europa centrale), Wed Oct 07 2020 00:00:00 GMT+0200 (Ora legale dell’Europa centrale)]
I searched online, but not being very proficient in Angular yet, I have not found a suitable solution!
I don't understand why I get this kind of result!
Do you have any suggestions?
Thank you so much :)
Try:
import { formatDate } from '@angular/common';
reservedDate: Date[];
reservedString: string[];
reservedString = reservedDate.map(d => formatDate(d, 'yyyy/MM/dd', 'en'));
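Alternatively, if you only need the yyyy/MM/dd strings and want to avoid Angular's formatDate or a date library, a plain TypeScript sketch using the built-in Date getters works too (formatYmd is just an illustrative name):

```typescript
// Format a Date as "yyyy/MM/dd" in local time, no libraries needed.
function formatYmd(d: Date): string {
  const pad = (n: number): string => String(n).padStart(2, "0");
  // getMonth() is zero-based, so add 1 before padding.
  return `${d.getFullYear()}/${pad(d.getMonth() + 1)}/${pad(d.getDate())}`;
}

const reserved: Date[] = [new Date(2020, 9, 1), new Date(2020, 9, 10), new Date(2020, 9, 30)];
const formatted: string[] = reserved.map(formatYmd);
console.log(formatted); // ["2020/10/01", "2020/10/10", "2020/10/30"]
```

Note that the results should stay strings: wrapping them back in new Date(...) (as your loop does with arraydateRes.push) converts them back into full Date objects, which is why your console.log shows the long time-zone form again.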
You could use the moment library:
1: install it with this command: npm i jalali-moment
2: import it: import * as moment from 'jalali-moment';
3: create a function in a shared service:
transformDate(date: any, format = 'YYYY/MM/DD'): string {
const MomentDate = moment(date, format);
return MomentDate.locale('en').format(format);
}
4: finally, just call this function and pass your data to it.

CKAN error 500 internal error on front page?

Any page I make changes to in CKAN always renders an error 500. Recently, the homepage will not work. This is the traceback I get.
[Wed Nov 20 09:24:40.170268 2019] [wsgi:error] [pid 1217:tid 140172777490176] Traceback (most recent call last):
[Wed Nov 20 09:24:40.170285 2019] [wsgi:error] [pid 1217:tid 140172777490176] File "/usr/lib/ckan/default/lib/python2.7/site-packages/pysolr.py", line 366, in _send_request
[Wed Nov 20 09:24:40.170296 2019] [wsgi:error] [pid 1217:tid 140172777490176] timeout=self.timeout)
[Wed Nov 20 09:24:40.170308 2019] [wsgi:error] [pid 1217:tid 140172777490176] File "/usr/lib/ckan/default/lib/python2.7/site-packages/requests/sessions.py", line 488, in get
[Wed Nov 20 09:24:40.170321 2019] [wsgi:error] [pid 1217:tid 140172777490176] return self.request('GET', url, **kwargs)
[Wed Nov 20 09:24:40.170334 2019] [wsgi:error] [pid 1217:tid 140172777490176] File "/usr/lib/ckan/default/lib/python2.7/site-packages/requests/sessions.py", line 475, in request
[Wed Nov 20 09:24:40.170345 2019] [wsgi:error] [pid 1217:tid 140172777490176] resp = self.send(prep, **send_kwargs)
[Wed Nov 20 09:24:40.170356 2019] [wsgi:error] [pid 1217:tid 140172777490176] File "/usr/lib/ckan/default/lib/python2.7/site-packages/requests/sessions.py", line 596, in send
[Wed Nov 20 09:24:40.170369 2019] [wsgi:error] [pid 1217:tid 140172777490176] r = adapter.send(request, **kwargs)
[Wed Nov 20 09:24:40.170379 2019] [wsgi:error] [pid 1217:tid 140172777490176] File "/usr/lib/ckan/default/lib/python2.7/site-packages/requests/adapters.py", line 487, in send
[Wed Nov 20 09:24:40.170392 2019] [wsgi:error] [pid 1217:tid 140172777490176] raise ConnectionError(e, request=request)
[Wed Nov 20 09:24:40.170405 2019] [wsgi:error] [pid 1217:tid 140172777490176] ConnectionError: HTTPConnectionPool(host='127.0.0.1', port=8983): Max retries exceeded with url: /solr/select/?sort=view_recent+desc&fq=%2Bcapacity%3Apublic+capacity%3A%22public%22&fq=%2Bsite_id%3A%221%22&fq=%2Bstate%3Aactive&facet.mincou$
[Wed Nov 20 09:24:40.463679 2019] [wsgi:error] [pid 1217:tid 140172777490176] 2019-11-20 09:24:40,463 ERROR [pysolr] Failed to connect to server at 'http://127.0.0.1:8983/solr/select/?sort=score+desc%2C+metadata_modified+desc&fq=owner_org%3A%22a515ac6d-0a66-4565-b727-914fd27b290a%22&fq=%2Bsite_id%3A%221%22&fq=%2Bst$
[Wed Nov 20 09:24:40.463751 2019] [wsgi:error] [pid 1217:tid 140172777490176] Traceback (most recent call last):
[Wed Nov 20 09:24:40.463761 2019] [wsgi:error] [pid 1217:tid 140172777490176] File "/usr/lib/ckan/default/lib/python2.7/site-packages/pysolr.py", line 366, in _send_request
[Wed Nov 20 09:24:40.463769 2019] [wsgi:error] [pid 1217:tid 140172777490176] timeout=self.timeout)
[Wed Nov 20 09:24:40.463777 2019] [wsgi:error] [pid 1217:tid 140172777490176] File "/usr/lib/ckan/default/lib/python2.7/site-packages/requests/sessions.py", line 488, in get
[Wed Nov 20 09:24:40.463786 2019] [wsgi:error] [pid 1217:tid 140172777490176] return self.request('GET', url, **kwargs)
[Wed Nov 20 09:24:40.463795 2019] [wsgi:error] [pid 1217:tid 140172777490176] File "/usr/lib/ckan/default/lib/python2.7/site-packages/requests/sessions.py", line 475, in request
[Wed Nov 20 09:24:40.463803 2019] [wsgi:error] [pid 1217:tid 140172777490176] resp = self.send(prep, **send_kwargs)
[Wed Nov 20 09:24:40.463811 2019] [wsgi:error] [pid 1217:tid 140172777490176] File "/usr/lib/ckan/default/lib/python2.7/site-packages/requests/sessions.py", line 596, in send
[Wed Nov 20 09:24:40.463819 2019] [wsgi:error] [pid 1217:tid 140172777490176] r = adapter.send(request, **kwargs)
[Wed Nov 20 09:24:40.463827 2019] [wsgi:error] [pid 1217:tid 140172777490176] File "/usr/lib/ckan/default/lib/python2.7/site-packages/requests/adapters.py", line 487, in send
[Wed Nov 20 09:24:40.464403 2019] [wsgi:error] [pid 1217:tid 140172777490176] raise ConnectionError(e, request=request)
[Wed Nov 20 09:24:40.464420 2019] [wsgi:error] [pid 1217:tid 140172777490176] ConnectionError: HTTPConnectionPool(host='127.0.0.1', port=8983): Max retries exceeded with url: /solr/select/?sort=score+desc%2C+metadata_modified+desc&fq=owner_org%3A%22a515ac6d-0a66-4565-b727-914fd27b290a%22&fq=%2
Does anyone know what exactly caused the problem? Is it a WSGI error? I installed CKAN from a package, so I do not know what exactly is wrong. Thanks in advance!
EDIT: I managed to fix it! I had forgotten to uncomment the entries I changed: #solr_url to solr_url and #JETTY_HOST to JETTY_HOST.
You are getting ConnectionError: Max retries exceeded because CKAN cannot connect to your Solr instance.
Did you set up Solr? If so, check solr_url inside your .ini file; the current setup tries to connect to http://127.0.0.1:8983.
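For reference, this is a minimal sketch of the relevant entry in CKAN's configuration file (the path /etc/ckan/default/production.ini is an assumption based on a typical package install); the line must be uncommented for search to work:

```
## Search Settings (in /etc/ckan/default/production.ini)
solr_url = http://127.0.0.1:8983/solr
```

After editing the file, restart the web server so CKAN picks up the change.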

How to sum values from an Array of objects

I have a data structure in Ruby as below:
[["N1-Alb",
{'Sun, 05 Feb 2017'=>"",
'Mon, 06 Feb 2017'=>"",
'Tue, 07 Feb 2017'=>"",
'Wed, 08 Feb 2017'=>"0.25",
'Thu, 09 Feb 2017'=>"0.03",
'Fri, 10 Feb 2017'=>"",
'Sat, 11 Feb 2017'=>""}],
["N1-Cet",
{'Sun, 05 Feb 2017'=>"",
'Mon, 06 Feb 2017'=>"7.8",
'Tue, 07 Feb 2017'=>"",
'Wed, 08 Feb 2017'=>"0.00",
'Thu, 09 Feb 2017'=>"",
'Fri, 10 Feb 2017'=>"",
'Sat, 11 Feb 2017'=>""}],
["N3-Tju",
{'Sun, 05 Feb 2017'=>"",
'Mon, 06 Feb 2017'=>"",
'Tue, 07 Feb 2017'=>"",
'Wed, 08 Feb 2017'=>"3.15",
'Thu, 09 Feb 2017'=>"",
'Fri, 10 Feb 2017'=>"8.0",
'Sat, 11 Feb 2017'=>""}],
["N7-Mlp",
{'Sun, 05 Feb 2017'=>"",
'Mon, 06 Feb 2017'=>"",
'Tue, 07 Feb 2017'=>"5.01",
'Wed, 08 Feb 2017'=>"0.03",
'Thu, 09 Feb 2017'=>"",
'Fri, 10 Feb 2017'=>"",
'Sat, 11 Feb 2017'=>"4"}]]
How can I get the sum for all Sundays, Mondays, etc. through Saturdays separately, in a Hash or an Array format?
The final hash should be:
result = { 'sun': '0',
'mon': '7.8',
'tue': '5.01',
'wed': '3.43',
'thu': '0.03',
'fri': '8.0',
'sat': '4' }
Try this:
Note that the date keys are strings, so they must be parsed with Date.parse (after require 'date') before calling cwday:
require 'date'
days = [:mon, :tue, :wed, :thu, :fri, :sat, :sun]
result = your_array.each_with_object({}) do |(_name, dates), h|
  dates.each do |key, value|
    day = days[Date.parse(key).cwday - 1]
    h[day] = (h[day].to_f + value.to_f).to_s
  end
end
I updated the code to be more concise, per your request. This code exploits the fact that nil.to_f == 0.0, which may upset some stomachs.
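The same approach can be run end to end; here is a self-contained sketch using only a subset of the sample data above (so the totals cover just that subset), accumulating into floats instead of strings:

```ruby
require 'date'

# Trimmed-down sample of the data above (only non-empty values kept)
data = [
  ["N1-Alb", { 'Wed, 08 Feb 2017' => "0.25", 'Thu, 09 Feb 2017' => "0.03" }],
  ["N3-Tju", { 'Wed, 08 Feb 2017' => "3.15", 'Fri, 10 Feb 2017' => "8.0" }]
]

days = [:mon, :tue, :wed, :thu, :fri, :sat, :sun]

# Date.parse turns each string key into a Date; cwday is 1 (Mon) .. 7 (Sun).
# Hash.new(0.0) gives every missing day an implicit 0.0 starting total.
result = data.each_with_object(Hash.new(0.0)) do |(_name, dates), h|
  dates.each do |key, value|
    h[days[Date.parse(key).cwday - 1]] += value.to_f
  end
end

puts result.inspect
```

Using Hash.new(0.0) avoids the nil.to_f trick entirely, since missing keys default to 0.0.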

Resources