Installing the datashader package from the bokeh channel

The question is simple: How do I install the datashader package?
The problem, however, might be a bit more complicated. I have already tried several things and still ended up not finding the datashader package. I ran the following commands in my Anaconda prompt:
[Anaconda2] C:\Users\Vuk>anaconda search -t conda datashade
Run 'anaconda show <USER/PACKAGE>' to get more details:
Packages:
Name | Version | Package Types | Platforms
------------------------- | ------ | --------------- | ---------------
ahmadia/datashader | 0.1.0 | conda | linux-64, osx-64
bokeh/datashader | 0.1.0 | conda | linux-64, osx-64
Found 2 packages
[Anaconda2] C:\Users\Vuk>anaconda show bokeh/datashader
Name: datashader
Summary:
Access: public
Package Types: conda
Versions:
+ 0.1.0
To install this package with conda run:
conda install --channel https://conda.anaconda.org/bokeh datashader
[Anaconda2] C:\Users\Vuk>conda install --channel https://conda.anaconda.org/bokeh datashader
Fetching package metadata: ........
Error: No packages found in current win-64 channels matching: datashader
Did you mean one of these?
datashape
You can search for this package on anaconda.org with
anaconda search -t conda datashader
As you can see, the search first says the package is there, but when I try to install it, conda can't find it anymore.
Do you know what the problem could be?

We're building Windows packages now, and will let you know when they are ready! The existing packages are for other platforms.
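For later readers: the root cause is already visible in the search output above, since the Platforms column lists only linux-64 and osx-64, so a Windows (win-64) conda has nothing to match. A self-contained sketch of that check, using a mock of the search table (the search.txt file name is made up):

```shell
# Neither build's Platforms column lists win-64; mock of the table above:
cat > search.txt <<'EOF'
ahmadia/datashader | 0.1.0 | conda | linux-64, osx-64
bokeh/datashader   | 0.1.0 | conda | linux-64, osx-64
EOF
# Check whether any listed build matches this machine's platform (win-64):
grep -q 'win-64' search.txt && echo "win-64 build available" \
                            || echo "no win-64 builds found"
```

Checking the Platforms column before running conda install saves a confusing "No packages found" error later.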

Related

What is the package name of Gnome Web / Epiphany?

I want to install GNOME Web from the command line on Ubuntu Desktop 18.04. I cannot use the Software Center, and I want to use apt, not snap.
Unfortunately, the Ubuntu Software Center does not display package names, only version information.
I tried apt install epiphany and apt install gnome-web-browser without result.
What is the package name of Gnome Web / Epiphany? This seems to be a carefully guarded secret.
The package name is epiphany-browser.
[onknows:~/git/tpelcm/ansible] master(+12/-7)* ± sudo apt-cache madison epiphany-browser
epiphany-browser | 3.28.6-0ubuntu1 | http://nl.archive.ubuntu.com/ubuntu bionic-updates/universe amd64 Packages
epiphany-browser | 3.28.1-1ubuntu1 | http://nl.archive.ubuntu.com/ubuntu bionic/universe amd64 Packages
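When the package name is not obvious, apt-cache search is useful because it matches package descriptions as well as names. A minimal sketch of the idea against a mock package index rather than a live apt database (packages.txt and its contents are made up; the live equivalent would be apt-cache search 'gnome web'):

```shell
# Mock 'name - description' lines in the shape apt-cache search prints:
cat > packages.txt <<'EOF'
epiphany-browser - Intuitive GNOME web browser
firefox - Safe and easy web browser from Mozilla
gnome-shell - graphical shell for the GNOME desktop
EOF
# Search descriptions for what the program *is*, then take the name column:
grep -i 'gnome web' packages.txt | cut -d' ' -f1
```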

ROS apriltags3 installing with a warning, preventing package identification

I am attempting to get ROS to run with the AprilTag library for robotics research. I am fairly new to ROS and don't really know where to start with this troubleshooting.
OS: Ubuntu 18.04.4
ROS distro: melodic
I have followed the quick-start tutorial from the AprilTag GitHub page.
Everything seems to happen without a hiccup until the catkin build command: the packages are found, all the dependencies are successfully installed, and the packages claim to build successfully once catkin build completes.
robertslab#robertslab-HP-Pavilion-Gaming-Laptop-15-cx0xxx:~/april_3$ catkin build
------------------------------------------------------------
Profile: default
Extending: [env] /opt/ros/melodic
Workspace: /home/robertslab/april_3
------------------------------------------------------------
Build Space: [exists] /home/robertslab/april_3/build
Devel Space: [exists] /home/robertslab/april_3/devel
Install Space: [unused] /home/robertslab/april_3/install
Log Space: [missing] /home/robertslab/april_3/logs
Source Space: [exists] /home/robertslab/april_3/src
DESTDIR: [unused] None
------------------------------------------------------------
Devel Space Layout: linked
Install Space Layout: None
------------------------------------------------------------
Additional CMake Args: None
Additional Make Args: None
Additional catkin Make Args: None
Internal Make Job Server: True
Cache Job Environments: False
------------------------------------------------------------
Whitelisted Packages: None
Blacklisted Packages: None
------------------------------------------------------------
Workspace configuration appears valid.
NOTE: Forcing CMake to run for each package.
------------------------------------------------------------
[build] Found '2' packages in 0.0 seconds.
[build] Updating package table.
Starting >>> catkin_tools_prebuild
Finished <<< catkin_tools_prebuild [ 1.5 seconds ]
Starting >>> apriltag
___________________________________________________________
Warnings << apriltag:install /home/robertslab/april_3/logs/apriltag/build.install.000.log
cp: cannot create regular file '/home/robertslab/.local/lib/python3.6/site-packages': No such file or directory
cd /home/robertslab/april_3/build/apriltag; catkin build --get-env apriltag | catkin env -si /usr/bin/make install; cd -
...........................................................
Finished <<< apriltag [ 7.0 seconds ]
Starting >>> apriltag_ros
Finished <<< apriltag_ros [ 15.2 seconds ]
[build] Summary: All 3 packages succeeded!
[build] Ignored: None.
[build] Warnings: 1 packages succeeded with warnings.
[build] Abandoned: None.
[build] Failed: None.
[build] Runtime: 23.7 seconds total.
[build] Note: Workspace packages have changed, please re-source setup files to use them.
Can someone please explain this warning?
Additionally, when I start roscore and do a package search, no AprilTag package shows up, and I am not sure why, or how to install the package into ROS.
I have re-sourced the setup.bash file as the output says, and AprilTag still does not show up in the rospack list output.
What am I missing?
The warning itself is fairly self-explanatory: the install step could not copy a file because the destination directory (/home/robertslab/.local/lib/python3.6/site-packages) does not exist.
Note that apriltag_ros depends on apriltag; these are two different packages.
Since this package is officially released for ROS Melodic, you can install it with:
sudo apt install ros-melodic-apriltag-ros
This takes care of all the needed dependencies and builds the project.
You can check the installation directory with:
roscd apriltag_ros
pwd
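As for the warning itself: cp fails with exactly that message when the destination's parent directory does not exist, so creating ~/.local/lib/python3.6/site-packages beforehand with mkdir -p would likely silence it. A self-contained reproduction of that failure mode with throwaway paths (the demo directory and file names are made up):

```shell
# Reproduce the same class of cp failure: missing parent directory.
rm -rf demo err.txt && mkdir demo && echo hi > demo/file.txt
cp demo/file.txt demo/missing/dir/file.txt 2>err.txt || true
grep 'No such file' err.txt
# Creating the destination first resolves it, just as
# 'mkdir -p ~/.local/lib/python3.6/site-packages' would for the build:
mkdir -p demo/missing/dir
cp demo/file.txt demo/missing/dir/file.txt && echo copied
```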

Go vendoring on App Engine Standard

I know this topic has been referenced a few times already. Unfortunately I still wasn't able to find a working solution for my use case.
I can't seem to get vendoring working for my Go application on App Engine Standard. I'm using dep for vendoring.
I'm building a GraphQL API and here is my folder structure:
/GOPATH
└──/src
└──/acme
├──/app
| ├── app.yaml
| └── app.go
├──/src
| ├── /mutations/
| ├── /queries/
| └── /types/
└──/vendor/
Running goapp serve app/app.yaml on Cloud Shell fails with
INFO 2018-05-14 15:42:08,386 devappserver2.py:764] Skipping SDK update check.
INFO 2018-05-14 15:42:08,471 api_server.py:268] Starting API server at: http://0.0.0.0:47213
INFO 2018-05-14 15:42:08,600 dispatcher.py:199] Starting module "default" running at: http://0.0.0.0:8080
INFO 2018-05-14 15:42:08,601 admin_server.py:116] Starting admin server at: http://0.0.0.0:8000
ERROR 2018-05-14 15:42:13,983 go_runtime.py:181] Failed to build Go application: (Executed command: /google/go_appengine/goroot/bin/go-app-builder -app_base /home/xxx/gopath/src/acme/app -arch 6 -dynamic -goroot /google/go_appengine/goroot -gopath /home/xxx/gopath:/google/gopath -nobuild_files ^^$ -incremental_rebuild -unsafe -binary_name _go_app -extra_imports appengine_internal/init -work_dir /tmp/tmpbt8DA2appengine-go-bin -gcflags -I,/google/go_appengine/goroot/pkg/linux_amd64_appengine -ldflags -L,/google/go_appengine/goroot/pkg/linux_amd64_appengine app.go)
/home/xxx/gopath/src/acme/vendor/github.com/graphql-go/graphql/definition.go:4: can't find import: "context"
2018/05/14 15:42:09 Can't find package "context" in $GOPATH: cannot find package "context" in any of:
/home/xxx/gopath/src/acme/vendor/context (vendor tree)
/google/go_appengine/goroot/src/context (from $GOROOT)
/home/xxx/gopath/src/context (from $GOPATH)
/google/gopath/src/context
It looks like the problem might be that one vendored dependency is not using a full import path for "context".
(EDIT: probably not the case though, since I'm using Go 1.8 and "context" has been in the standard library since Go 1.7.)
Has anyone ever managed to successfully deploy on App Engine Standard using vendoring? I've been pulling my hair out all day on this.
Just in case anyone else struggles with this, this is the approach I've taken that seems to work for me.
Directory structure looks like this:
/GOPATH
├──/appengine
| ├──/.git/
| ├──/project1
| | ├── app.yaml
| | └── app.go
| └──/project2
| ├── app.yaml
| └── app.go
└──/src
├──/project1
| ├──/.git/
| ├──/mutations/
| ├──/queries/
| ├──/types/
| ├──/vendor/
| └──/main.go
└──/project2
├──/.git/
├──/foo/
├──/bar/
├──/vendor/
└──/main.go
Each app.go file below the appengine folder contains:
package projectX
import "projectX"
func init() {
projectX.Run()
}
Each main.go file below src/projectX contains:
package projectX
import (
// Import whatever you need
"google.golang.org/appengine"
)
func Run() {
// Do whatever you need
appengine.Main()
}
Seems that having the folder that contains app.yaml outside of $GOPATH/src is indeed necessary.
This is also not ideal for version control if you need each project versioned under its own git repo as opposed to one monolith repo. I solved this by versioning each project separately AND versioning the appengine folder on its own.
I was having issues with this and spent ages looking around trying to figure out why it wasn't working. This answer is a little late but hopefully will be useful for anyone else who has this issue.
I updated to Go 1.11, which I had thought wasn't supported (I found one of the GCP examples on GitHub using it here).
Set runtime: go111 in app.yaml and it will support vendoring and give you a link to a proper build log.
Now my directory structure is as follows.
/GOPATH
└──/src
├──/project1
| ├──/.git/
| ├──/whateverCode/
| ├──/vendor/
| ├──/main.go
| └──/app.yaml
I assume if it supports Go 1.11 we could also use Modules for versioning but I haven't looked into that yet.
The context package will be inside $GOROOT (not in the vendor directory). Your Go App Engine SDK is probably old and does not support Go 1.8.
Update your SDK to the latest version; $GOROOT should then look like /path/to/somewhere/goroot-1.8

How to find the cause of "Warning: PropTypes has been moved to a separate package"

If I get the warning "Warning: PropTypes has been moved to a separate package.", how can I locate which npm package is still using it? The warning doesn't offer any details about what file or package is causing it.
React deprecated PropTypes in its main package, so you can't rely on React.PropTypes. Using React.PropTypes produces that warning, but using PropTypes from the prop-types package is fine.
That's it :)
You can use this knowledge to find the list of npm packages which are using it through the following command.
find ./node_modules -type f -print0 | xargs -0 grep 'PropTypes' | cut -d/ -f3 | sort | uniq | xargs -I{} grep -L 'prop-types' ./node_modules/{}/package.json
The above command will find all the npm packages having PropTypes word present in any of their files, then it looks into the package.json file of that package to check whether the prop-types package is included or not. If prop-types package is missing then it prints the path of that package.
PS: I'm no bash expert, so I've taken a little help from this serverfault answer. (To find the unique npm packages containing the PropTypes word)
PPS: The answer assumes that you are using a Unix machine.
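To see what the pipeline is doing in isolation, here is a self-contained sketch against a throwaway mock node_modules (the old-lib and good-lib package names are made up): old-lib references PropTypes in its code without declaring prop-types, so it is flagged; good-lib declares prop-types, so it is not.

```shell
# Build a tiny mock node_modules to exercise the detection logic.
rm -rf demo-nm
mkdir -p demo-nm/node_modules/old-lib demo-nm/node_modules/good-lib
echo 'var T = React.PropTypes;' > demo-nm/node_modules/old-lib/index.js
echo '{ "name": "old-lib", "dependencies": {} }' > demo-nm/node_modules/old-lib/package.json
echo 'var T = require("prop-types").PropTypes;' > demo-nm/node_modules/good-lib/index.js
echo '{ "name": "good-lib", "dependencies": { "prop-types": "^15.0.0" } }' > demo-nm/node_modules/good-lib/package.json
cd demo-nm
# Same shape as the command above: list packages whose files mention
# PropTypes, then keep only those whose package.json lacks prop-types.
find ./node_modules -type f -name '*.js' -print0 \
  | xargs -0 grep -l 'PropTypes' \
  | cut -d/ -f3 | sort -u \
  | xargs -I{} grep -L 'prop-types' ./node_modules/{}/package.json
cd ..
```

Only old-lib's package.json path is printed, which is exactly the kind of package that triggers the warning.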
This is just an idea, so it might not be that useful and there may be a better solution, but you can use the npm ls command to get the dependency tree. Then you can locate the packages that depend on react but not on prop-types. This is a somewhat manual approach, though, and there may well be a better one.
More info about npm ls here
Synopsis
npm ls [[<#scope>/]<pkg> ...]
aliases: list, la, ll
Description
This command will print to stdout all the versions of packages that
are installed, as well as their dependencies, in a tree-structure.

conda packages with version name of 'custom'

When I run conda search anaconda I find a few custom-version packages, shown as follows:
Fetching package metadata: ....
anaconda 1.6.0 np17py33_0 defaults
... ... ...
4.0.0 np110py35_0 defaults
4.0.0 np110py34_0 defaults
4.0.0 np110py27_0 defaults
custom py35_0 defaults
custom py34_0 defaults
custom py27_0 defaults
Note that these three custom-version packages are shown at the end of the conda search anaconda results, so conda considers them the newest version, which also affects conda install anaconda (so I have to use conda install anaconda=4.0.0).
Then conda info anaconda=custom gives following results:
Fetching package metadata: ....
anaconda custom py35_0
----------------------
file name : anaconda-custom-py35_0.tar.bz2
name : anaconda
version : custom
build number: 0
build string: py35_0
channel : defaults
size : 3 KB
date : 2016-03-14
license : BSD
md5 : 47c237b38bfc175cb73aed8b8b33ade7
space : python
installed environments:
dependencies:
python 3.5*
anaconda custom py34_0
----------------------
file name : anaconda-custom-py34_0.tar.bz2
name : anaconda
version : custom
build number: 0
build string: py34_0
channel : defaults
size : 3 KB
date : 2016-03-14
license : BSD
md5 : 767a59923372d998b8c83fb16ac035a1
space : python
installed environments:
dependencies:
python 3.4*
anaconda custom py27_0
----------------------
file name : anaconda-custom-py27_0.tar.bz2
name : anaconda
version : custom
build number: 0
build string: py27_0
channel : defaults
size : 3 KB
date : 2016-03-14
license : BSD
md5 : 8288aef529d5a46d07bd84b4fcf4308a
space : python
installed environments:
dependencies:
python 2.7*
But I don't know/remember HOW and WHY these three packages appeared on this computer. Can anyone explain:
How these custom version pkgs are created in the first place?
How/Why these custom version pkgs are shown in the conda search results?
How to remove these custom version pkgs?
The one custom version of any package that exists (right now, in the official repos) is for the anaconda package.
Here's the reason... The anaconda conda packages are metapackages, meaning they are packages of packages, that is, packages that have no real source code and only bring in a bunch of dependencies. Each anaconda package has every sub-package pinned to an explicit and specific version of that sub-package. That's because Continuum does extensive testing on the interoperability of that set of packages (and those specific versions).
Now, after you've installed anaconda, either through the Anaconda Installer or installing Miniconda and then conda install anaconda, you have a set of packages with all of these tested guarantees. There's no reason you have to stick to this locked set of packages--you can install anything and any version you want. You no longer have a version-identifiable Anaconda Distribution though. You've customized it. Thus, when you run conda list and the version of the anaconda package shows custom, you know you've diverged from the set of packages in the Anaconda Distribution that are robustly tested for interoperability.
Your conda search anaconda query just reflects an artifact of how this is implemented. You'll notice in that query that custom packages are listed first, meaning they have the lowest sort order when comparing versions. Thus, if you run conda update anaconda after you've diverged from the specifically-pinned anaconda packages, you'll be back to a numbered version of the Anaconda Distribution.
This is really a partial answer. I'm not positive why exactly this version exists.
(1) In terms of the specific version value of custom it seems this is allowed from here:
version: string
The package version, which may not contain -. Conda acknowledges PEP 440.
So this anaconda package would be created in the same way as any of the other versions. I would assume using conda build.
(2) They are shown in the search results because they exist in the anaconda cloud. It seems this is an officially released version of anaconda.
As for why it exists: if you download one of the actual package files (for example linux-64-anaconda-custom-py35_0.tar.bz2), expand it, and read the info/index.json file, it looks like this package simply installs python and the other bare-bones needs. Compare this to anaconda version 4.0.0, or one of the others, and you will see a ton of packages. I assume this package exists so that if someone installs the custom version they just get the bare-bones packages and can then go through conda install-ing any others they want.
For example, look at the packages when you do conda create -n anc-test anaconda=4.0.0 vs. conda create -n anc-test anaconda=custom.
EDIT: Just saw that that is also in your conda info so you are probably already aware of the difference in dependencies.
(3) I don't think you can remove these custom packages from your search call as they are legitimate packages in the anaconda cloud. You might be able to exclude them from the conda search via regex. It doesn't look like from your output that they have been installed -- at least not in the current environment.
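If the goal is just to hide the custom entries, filtering the search output is enough. A sketch against a mock of the search listing (the anaconda-search.txt file is made up; the live equivalent would be conda search anaconda | grep -v custom):

```shell
# Mock of the conda search listing shown in the question:
cat > anaconda-search.txt <<'EOF'
4.0.0    np110py35_0    defaults
4.0.0    np110py27_0    defaults
custom   py35_0         defaults
custom   py27_0         defaults
EOF
# Drop the custom rows, keeping only numbered versions:
grep -v 'custom' anaconda-search.txt
```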
