'clarinet integrate' quickly fails and nothing is logged to console?

Following https://docs.hiro.so/smart-contracts/devnet, I can't get the command clarinet integrate to work. I have installed Docker on my Mac and am running version 0.28.0 of Clarinet. I'm running the command within 'my-react-app/clarinet', where all Clarity-related files live (contracts, settings, tests, and Clarinet.toml).
My guess is it could be an issue with Docker?

The issue was that I downloaded my Devnet.toml file from a repo that was configured incorrectly. The configuration I needed was:
[network]
name = "devnet"
I increased the CPU and Memory in Docker as well.
There is still an issue when the command attempts to spin up the Stacks Explorer, but I was informed that there are several known issues with the Stacks Explorer launched from clarinet integrate at the moment.

Depending on how the last devnet was terminated, you could have some containers still running. This issue should be fixed in the next release; meanwhile, you'd need to terminate these stale containers manually.
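Until that fix ships, something like this works (a generic sketch; match on whatever names your devnet containers use):
docker ps                     # look for leftover devnet containers
docker rm -f <container-id>   # force-remove each stale one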

Apart from Ludo's suggestions, I'd also look into your Docker resources. The default CPU/memory allocation should allow you to get started with Clarinet, but just in case, you could increase it to see if that makes a difference.
Alternatively, to isolate the problem, you could reuse one of the samples (e.g., hirosystems/stacks-billboard) instead of running your project. See if the sample comes up as expected; if it does, there could be something missing in your project.
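For example (a sketch, assuming git and clarinet are on your PATH; the repo is the sample mentioned above):
git clone https://github.com/hirosystems/stacks-billboard
cd stacks-billboard   # run clarinet from the directory containing Clarinet.toml
clarinet integrate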

Related

Unable to save file changes while using VS Code Remote Container due to "read-only file system"

I'm experimenting with using dev-containers for development by trying to follow along with this simple example: https://github.com/microsoft/vscode-remote-try-python
The setup works fine and I am able to build and connect to the container, and run the app just fine. However, if I try to edit anything and save it, I get an error:
Failed to save 'app.py': Unable to write file 'vscode-remote://dev-container+2f55736572732f62726164656e2e6b696e6172642f706572736f6e616c2f7673636f64652d72656d6f74652d7472792d707974686f6e/workspaces/vscode-remote-try-python/app.py' (Unknown (FileSystemError): Error: EROFS: read-only file system, open '/workspaces/vscode-remote-try-python/app.py')
If I open a secondary window with the local folder open, I can save changes and those are reflected in the remote container window. But due to the file system being set to read-only, I can't edit anything from within the remote container. Any ideas on why I am stuck in read-only?
One potentially important note is that I am using Colima (version 0.2.2) rather than Docker Desktop, though I haven't found anything to indicate that this would be an issue.
I found the answer to my own problem. It turns out the issue was using Colima as a runtime. I came across the discussion around issue #102 on the Colima GitHub page. According to the developer, the default mount "used to be read-only, but changed to writable in version 0.3.0". I was using v0.2.2.
I updated colima to the most recent version (v0.4.4) and it fixed the issue for me.
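For anyone else hitting this, the upgrade is straightforward if Colima was installed via Homebrew (a sketch; restarting recreates the VM mounts):
colima version                # confirm you're on a pre-0.3.0 release
brew upgrade colima
colima stop && colima start   # restart so the mount comes back writable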

How to (semi-)automatically sync local files with remote devcontainer?

My Goal
I've been using devcontainers in combination with WSL2 for a little while now, but I keep running into issues, and besides that I like offloading work from my laptop to a server. Moving the containers to a native Linux server would solve my issues.
My ideal situation would be a solution that works just like working locally on my Windows laptop (later probably moving to a MacBook) but uses the facilities of a Linux server (which has systemd and netns), moving the workload there as well so my laptop doesn't sound like a vacuum cleaner.
My Journey
I'm trying to setup remote containers as described here: https://code.visualstudio.com/remote/advancedcontainers/develop-remote-host
Actually, the containers are running fine. I'm using the second storage solution, which means I add the following to my .devcontainer.json:
"workspaceMount": "source=/home/marvink/code,target=/workspaces,type=bind,consistency=cached"
And my workflow currently looks something like this:
Clone project locally (with .devcontainer already in the project)
Add workspaceMount above to devcontainer.json
Clone project on remote (e.g. to /home/marvink/code/new-project)
Open project locally
Build and reopen in container
Work on the files on the remote
My issue
This works, but now I have files on my local drive that never get touched, which isn't ideal but not a disaster. The bigger issue is when I want to update the devcontainer: I need to do that locally (in a separate window) and manually copy-paste it to the remote if I want to commit it to git, and of course I sometimes forget this and try to edit it remotely, which causes a lot of frustration (and sometimes it seems like it does use the remote config, but that might have been a mistake?).
This is why I want to set up rsync to sync changes both ways; as a bonus, I could edit files locally when I'm offline. The link describes how to do it manually, but I want it automated so that I can't forget or make mistakes.
From PowerShell I'm able to run an rsync command that syncs one way, and I could extend that to sync two ways:
wsl rsync -rlptzv --progress --exclude=.git '$PWD' 'marvink@s-dev01:~/code/new-project'
This needs to be run locally, but I can't find a way to do that. I'd need to run a task locally, for example, but that isn't possible when working on a remote (https://github.com/microsoft/vscode-remote-release/issues/168).
The other way around doesn't seem like an option to me as I don't want to expose any ports on my laptop and firewalls would get in the way depending on where I am.
My question
My workflow still seems a bit convoluted so I'm open to suggestions on that end but any ideas on how to sync my workspace files?
You don't need a local version of your code (containing the .devcontainer folder) if you're storing that code on the remote server. You should be able to open an SSH target in VS Code using the Remote - SSH extension, which is the recommended approach in the link you added. The Remote - Containers extension 'stacks' on top of the SSH extension, so once connected over SSH you then connect to the container using the .devcontainer.json configuration located on your remote server.
If you don't want to use the extension, and instead use a bind mount and specify docker.host in your settings.json file, you can sync code using the approaches in that same article: SSHFS, Docker Machine, or rsync.
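For reference, pointing VS Code at a remote Docker daemon is a one-line entry in settings.json (user and host here are placeholders matching the question):
{
  "docker.host": "ssh://marvink@s-dev01"
}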

"Operation in progress..." Never Ends When Previewing Rows in Kettle Spoon via Salesforce

I am trying to pull data from a Salesforce API account using Pentaho/Kettle Spoon. I am able to establish a connection on Pentaho with this account. I am also able to get fields from specific modules. However, when I try and "Preview rows" (even with a small number of rows), the "Operation in Progress" window comes up and never completes the task. When I try and cancel the job, Pentaho hangs and I have to force quit it.
I am running Fedora 21. Any input is greatly appreciated.
Thanks!
This works for me:
Since upgrading to Fedora 22, Spoon (the client tool of PDI) was not working properly anymore. Although I could start Spoon and create transformations etc., once I wanted to execute a transformation, or sometimes even when trying to open settings, nothing worked and the terminal window showed several (SWT:20352): GLib-CRITICAL error messages. In a nutshell, Spoon was rendered useless.
Here is how to solve this:
Go to the Eclipse download page and download the latest 64-bit version of Eclipse IDE for Java EE Developers. Note: there is a separate download area for SWT; however, for Linux there is no 64-bit version available. The only way to get one is to build it yourself, so I went the easier way and downloaded Eclipse instead.
Unzip the file and search for swt. The search will show a few files, but the ones of interest to us are (your version numbers might differ): org.eclipse.swt_3.103.2.v20150203-1313.jar and org.eclipse.swt.gtk.linux.x86_64_3.103.2.v20150203-1351.jar.
Copy the first of these files into <PDI_HOME>/libswt/linux/x86 and the second into <PDI_HOME>/libswt/linux/x86_64.
Both folders still have the original jar files in them. Rename those to swt-jar-old (note: no extension, so that they are not picked up).
Start Spoon. There will be a few error messages shown, but so far Spoon is working way better for me than before.
Source: http://diethardsteiner.github.io/pdi/2015/06/07/Fixing-PDI-GLib-CRITICAL.html
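The copy/rename steps above as shell commands (a sketch; <PDI_HOME> and the jar versions depend on your installation, and I'm assuming the originals are named swt.jar as in stock PDI):
# copy the SWT jars from the unzipped Eclipse 'plugins' directory into PDI
cp org.eclipse.swt_3.103.2.v20150203-1313.jar <PDI_HOME>/libswt/linux/x86/
cp org.eclipse.swt.gtk.linux.x86_64_3.103.2.v20150203-1351.jar <PDI_HOME>/libswt/linux/x86_64/
# rename the originals (no .jar extension, so they are not picked up)
mv <PDI_HOME>/libswt/linux/x86/swt.jar <PDI_HOME>/libswt/linux/x86/swt-jar-old
mv <PDI_HOME>/libswt/linux/x86_64/swt.jar <PDI_HOME>/libswt/linux/x86_64/swt-jar-old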
I'm running Ubuntu 14.04 (i5, 6 GB RAM) and ran into the same trouble.
When I checked, processor load was at 100%; every time I tried to show content or do anything else in Pentaho, the "Operation in progress" window showed up and never ended.
How to solve it: check your access to the files/connections in your database (unrecognized files, fields, etc.). After resolving this, my processor load returned to normal and I could run/execute the transformation.

Real remote editing without X-Forwarding, using Vim or the like

I'm currently working on a rather large web project written using C servlets (utilizing the GWAN web server). In the past I've used a couple of IDEs for my LAMP/PHP jobs, like Eclipse.
My problems with Eclipse are that you can either mirror the project locally, which isn't possible in this case as I'm working on a Mac (the server does not run on OS X), or use the "remote" view, which re-uploads files when you save them.
In the latter case, the file is only partly written while uploading, which makes this a no-go for a running web server; the file could also become corrupted if the connection is lost during the upload. Also, uploading the whole file just to change a few characters seems rather inefficient to me.
So I was thinking:
Wouldn't it be possible to have the IDE open Vim over SSH and mirror my changes there, and then just :w (save)? Or use some kind of diff files for the changes?
The first would be preferred, as it has the added advantage of Vim's .swp files, which make it possible for others to know when someone is already editing the file.
My current solution is using ssh+vim, but then I lose all the cool features I have with Eclipse and other more advanced IDEs.
Also, regarding X-Forwarding: The reason I don't like it is speed. It feels way slower than just editing locally, and takes up unneeded bandwidth, when all I want to do is basically "text editing".
P.S.: I couldn't find any more appropriate tags for the question, especially no "remote" tag, but if you know any, feel free to add them. Also, if there is another similar question, feel free to point it out - I couldn't find any.
Thank you very much.
If you're concerned about having to transmit the entire file for minor changes, the only solution that comes to my mind is running (either continuously, or on demand) an rsync job that mirrors the remote site to your local system (and back). The rsync protocol just transmits the delta information. According to Are rsync operations atomic at file level?, the change is atomic.
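If you want the continuous variant on a Mac, a minimal sketch would be the following (assuming fswatch from Homebrew; paths and host are placeholders):
# re-run rsync whenever something changes locally; rsync uploads to a
# temporary file and renames it into place, so readers never see a partial file
fswatch -o ~/project | while read _; do
  rsync -az --delete ~/project/ user@server:/var/www/project/
done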
Another possibility: run everything in a virtual machine on your Mac. The server and the IDE/text editor are both on the same virtual machine so you don't have to fear network issues.
Because the source code on the virtual machine is under some kind of VCS, the classic code → test → commit process is trivial (at least theoretically).

Two way sync with rsync

I have a folder a/ and a remote folder A/.
I now run something like this in a Makefile:
get-music:
	rsync -avzru server:/media/10001/music/ /media/Incoming/music/
put-music:
	rsync -avzru /media/Incoming/music/ server:/media/10001/music/
sync-music: get-music put-music
When I run make sync-music, it first gets all the diffs from server to local and then does the opposite, sending all the diffs from local to server.
This works very well, but only for updates and new files. If there are deletions, it doesn't do anything about them.
rsync has --delete and --delete-after options to help accomplish what I want, but the thing is, they don't work for a two-way sync.
Deleting files on the server during a sync when they've been deleted locally works; but when some files still exist locally that are no longer on the server (because, as explained below, they were deleted elsewhere), I want them removed locally rather than copied back to the server (which is what happens now).
The thing is, I have 3 machines involved:
desktop
notebook
home-server
So sometimes the server will have had files deleted by a sync from the notebook, for example, and then, when I run a sync from my desktop (where those deleted files still exist), I want these files to be deleted locally and not copied back to the server.
I guess this is only possible with a database and a log of operations :P
Any simpler solutions?
Thank you.
Try Unison: http://www.cis.upenn.edu/~bcpierce/unison/
Syntax:
unison dirA/ dirB/
Unison asks what to do when files differ, but you can automate the process by using the following, which accepts the default (non-conflicting) actions:
unison -auto dirA/ dirB/
unison -batch dirA/ dirB/ asks no questions at all, and writes to output how many files were ignored (because they conflicted).
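Applied to the music example from the question, a two-way sync over SSH would look something like this (Unison must be installed on both machines; the double slash makes the remote path absolute):
unison -auto /media/Incoming/music ssh://server//media/10001/music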
Note: I am no longer using Unison (I use NextCloud, which doesn't address the original use case). However, note that rsync is not designed for bidirectional sync, while Unison is. Unison may have its bugs (as any other piece of software) and its wrinkles. I am surprised it seems to be actively maintained now (last time I looked, I thought it was dead), but I'm not sure what the state is nowadays. I haven't had the need for a two-way file synchronizer recently, so there may be better options.
Since the original question also involves a desktop and a laptop, and the example involves music files (so the asker is probably using a GUI), I'd also mention one of the best bidirectional, multi-platform, free and open source programs to date: FreeFileSync.
It's GUI-based, very fast and intuitive, and comes with filtering and many other options, including the ability to connect to remote hosts, to view and interactively manage "collisions" (for example, files with similar timestamps), and to switch between bidirectional transfer, mirroring, and so on.
FreeFileSync can easily sync two computers on the same network and also sync two computers on different and remote networks.
On the same network: have FreeFileSync use the local file system on one side and a shared network drive/path on the other. On Windows systems you enable file/disk sharing on one computer and access that share from the other. I use FreeFileSync this way to keep my main development PC's source code synced with my two laptops.
I have also synced one of these laptops with a Linux server with Samba installed and sharing one of its directories.
Across networks: create a VPN and do the same as above. FreeFileSync will see the remote disk as if it were on the local network. Or buy a router that lets you attach a USB disk and share it over the internet. I have installed a VPN on a remote Linux server and used it through the OpenVPN Windows client.
You could also try bitpocket: https://github.com/sickill/bitpocket
Try this:
get-music:
	rsync -avzru --delete-excluded server:/media/10001/music/ /media/Incoming/music/
put-music:
	rsync -avzru --delete-excluded /media/Incoming/music/ server:/media/10001/music/
sync-music: get-music put-music
I just tested this and it worked for me. I'm doing a two-way sync between Windows 7 (using Cygwin with the rsync package installed) and a FreeNAS file server (FreeNAS runs on FreeBSD with the rsync package pre-installed).
You might use Osync (http://www.netpower.fr/osync), which is rsync-based with intelligent deletion propagation. It also has multiple options, like resuming a halted execution, soft deletion, and time control.
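A basic invocation looks roughly like this, going by the osync README (paths and host are placeholders):
osync.sh --initiator="/media/Incoming/music" --target="ssh://server//media/10001/music"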
You could try csync; it is the sync engine under the hood of ownCloud.
I'm surprised no one has mentioned Syncthing yet. I have been using it for years to synchronize my phone, my tablet and my two laptops. One time I also used it to send 10 GB of photos to my family ~600 km away, straight from my machine to their machine, and it was incredibly fast (despite the data getting routed through Syncthing's discovery server to work around NAT issues). I also tried OwnCloud/NextCloud at some point but Syncthing has been much more reliable and, also, much faster.
I'm now using SparkleShare (https://www.sparkleshare.org/). It works on Mac, Linux, and Windows.
I'm not sure whether it works for two-way syncing, but for --delete to work you also need to add the --recursive parameter.
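Applied to one direction of the Makefile above (paths from the question):
rsync -avzu --recursive --delete /media/Incoming/music/ server:/media/10001/music/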
Rclone is what you are looking for. Rclone ("rsync for cloud storage") is a command line program to sync files and directories to and from different cloud storage providers including local filesystems. Rclone was previously known as Swiftsync and has been available since 2013.
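Note that plain rclone sync is one-way; for the two-way case in this question you'd look at rclone bisync (available in newer releases). A sketch, assuming 'server:' has been configured as an rclone remote:
# the first run needs --resync to establish the baseline listings on both sides
rclone bisync /media/Incoming/music server:/media/10001/music --resync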
