How do you specify a custom MacPorts distfiles mirror?

We have an isolated network with a mirror of MacPorts. I am trying to get a machine to properly reference this mirror for the various needs of port, and I have managed to get it to use our mirror for the base image and the packages directory (à la this post), but some packages don't have pre-built images in the packages directory, so port tries to fetch the corresponding source package from distfiles. However, I haven't found any way to make it automatically use our mirror of distfiles. If I manually sync individual packages from our mirror to the local cache directory, that works, but I'm trying to make this work automatically and avoid the tedious process of syncing individual packages as needed. I'd also like to avoid having to sync the whole distfiles mirror locally.
I've tried to search for where the distfiles mirror list comes from to even try manually editing it, but I can't seem to find that, either.
Is there a proper way to do this?
If not, does anyone know what file I need to change to hack in our own URL?
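For illustration, one hedged possibility (assuming a MacPorts version whose macports.conf supports the preferred_hosts option; the mirror hostname below is a placeholder):

    # /opt/local/etc/macports/macports.conf
    # Space-separated glob patterns for hosts to try first when fetching
    # (distfiles included). Replace with your internal mirror's hostname.
    preferred_hosts mirror.internal.example

The mirror lists themselves appear to live in the ports tree under _resources/port1.0/fetch/mirror_sites.tcl, so an edited copy of that file in your synced tree would be the "hack in our own URL" route.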

Related

How do you manage static data for microservices?

For a database-per-service architecture, how do you guys manage the static data for each microservice? I want to make it easy for a new developer to jump in and get everything up and running on their local machine. I'm thinking of checking the entire database of static data into source control, with Docker bind mounts, so people can just docker-compose up the database service locally (along with whatever other infrastructure services they might need to run and test their microservice).
I know each microservice might need to handle this in their own way, but I'd like to provide a good default template for people to start with.
Making a standard for how to do this sort of goes against the reason for making microservices, i.e. that you can adapt each microservice to the context it exists in.
That being said, Postgres, Mongo and MySQL all run scripts in /docker-entrypoint-initdb.d when initializing a fresh database instance. The scripts have to fit the database obviously, but it's a fairly standardized way of doing it.
They all have descriptions of how to do it on their image pages on Docker Hub.
You can either get your scripts into the container by making a custom image that contains the scripts or you can map them into the directory using a docker-compose volume mapping.
There are some databases that don't have an easy way to initialize a new database. MSSQL comes to mind. In that case, you might have to handle it programmatically.
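To make that concrete, here is a minimal docker-compose sketch of the init-script approach (assuming Postgres; the service name, image tag, and ./initdb path are placeholders):

    # docker-compose.yml
    services:
      db:
        image: postgres:16
        environment:
          POSTGRES_PASSWORD: dev-only-password
        volumes:
          # Any *.sql / *.sh files in here run once, when the data directory is empty.
          - ./initdb:/docker-entrypoint-initdb.d:ro
          # Named volume so the data survives container restarts.
          - db-data:/var/lib/postgresql/data
    volumes:
      db-data:

Keep in mind the init scripts only run against a fresh data directory; docker-compose down -v wipes the volume so they run again on the next up.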

What is the best way to add a translation.json file to a React app running inside Docker

I am working on a React web application which may require multi-language support. I am using i18next, which internally loads the required configuration file from a specific directory based on the language the user selects.
The words and sentences that need to be translated may grow as new screens are added, and if a new locale folder is added, we will load those languages into the application.
What is the best way (I mean scalable, easy to configure, platform-provided...) to satisfy this requirement?
( :( All I can think of is mounting an external locales folder onto the folder inside the container. Is that the only way, or is there something else?)
Note: Kubernetes and Rancher are available for management. Please provide a solution/suggestion around that.
Thank you.
If you can pull the files from the storage bucket in CI/CD and add them to the Docker image, managing them inside the image, that would be one way.
Following this approach should be helpful when scaling the application up, since there is no external locales folder to manage or worry about.
If by an external locales folder you mean a host path on the node: what if Kubernetes moves your workload to another node during maintenance? How would you add the files to each node, or manage that?
If you use a PVC you might hit the ReadWriteOnce limitation; if you scale the replicas you need ReadWriteMany. Try to keep the containers stateless as much as possible.
If you can create the directory inside the Docker image and use it directly, that would be perfect; otherwise you could use NFS-style storage such as GlusterFS (or MinIO), which also supports ReadWriteMany.
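As a sketch of the bake-it-into-the-image approach (assuming an nginx-served React build; every path and image tag here is a placeholder):

    # Dockerfile
    FROM node:20 AS build
    WORKDIR /app
    COPY . .
    RUN npm ci && npm run build

    FROM nginx:alpine
    # The compiled app and its locale JSON files ship inside the image,
    # so every replica is identical and no volume or host path is needed.
    COPY --from=build /app/build /usr/share/nginx/html
    COPY --from=build /app/public/locales /usr/share/nginx/html/locales

Adding a new language then just means committing the new JSON files and letting CI/CD rebuild the image.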

Can one ever access a file stored in Heroku's ephemeral file system via the browser?

I've been writing an app to deploy on Heroku. So far everything has been working great. However, there's one last step that I haven't been able to solve: I need to generate a CSV file on-the-fly from the DB and let the user download the file.
On my local machine I can simply write the file to a folder under the web app root, e.g. /output, and then redirect my browser to http://localhost:4000/output/1.csv to get the file.
However, the same code fails again and again on Heroku no matter how I try to modify it. It either complains about not being able to write the file or not being able to redirect the browser to the correct path.
Even if I manually use heroku run bash and create an /output folder in the project root and create a file there, when I try to direct my browser there (e.g. https://myapp.herokuapp.com/output/1.csv), it simply says "Page not found".
Is it simply impossible to perform such an action on Heroku after all? I thought since we are free to create files on the ephemeral file system we should be able to access it from the browser as well, but things seem to be more complicated than that.
I'm using the Phoenix framework and I already added
plug Plug.Static,
at: "/output", from: Path.expand("output/"), gzip: false
to my endpoint.ex. Apparently it works on localhost but not on Heroku?
I'll turn to Amazon S3 if indeed what I'm trying to do is impossible. However, I want to avoid using S3 as much as possible since this should be a really simple task and I don't want to add another set of credentials/extra complexity to manage. Or is there any other way to achieve what I'm trying to do without having to write the file to the file system first and then redirecting the user to it?
I know it doesn't strictly answer your question, but if you don't mind generating the CSV every time it is requested, you can use Controller.send_download/3 and serve an arbitrary payload as a download (in your case, the contents of the CSV).
Naturally you could store the generated CSVs somewhere (like the database or even ets) and generate them "lazily".
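A minimal sketch of that send_download approach (assuming a Phoenix controller; the module name, route, and CSV-building helper are placeholders):

    defmodule MyAppWeb.ReportController do
      use MyAppWeb, :controller

      # Streams the CSV straight to the client; nothing is written to the
      # ephemeral filesystem, so it behaves the same on Heroku as locally.
      def download(conn, _params) do
        send_download(conn, {:binary, build_csv()},
          filename: "1.csv",
          content_type: "text/csv"
        )
      end

      # Placeholder: build the CSV binary from the database.
      defp build_csv do
        "id,name\r\n1,example\r\n"
      end
    end

Hook it up with something like get "/output/1.csv", ReportController, :download in the router, and the Plug.Static workaround becomes unnecessary.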

Dart: Accessing a resource outside the project|web/ directory

I have a web app (browser-based) which needs to access a folder full of icons that resides outside the web folder.
This folder MUST be outside the web folder, and would ideally exist outside the project folder altogether.
However, when specifying the path to the folder, neither "../" nor a symlink will work.
When the page attempts to load the image, I always get
"[web] GET /Project|web/icons/img.png => Could not find asset Project|web/icons/img.png."
even though I set the image source to "../icons/img.png".
How can I get Dart to access this file properly?
PS: I attempted a symlink to another part of the filesystem (where the images would ideally be kept), but this did not work either.
The web server integrated into DartEditor or pub serve only serves directories that are added as folders to the files view. When you add the folder to DartEditor you should be able to access the files. This is just for development.
You also have to find a solution for when you deploy your server app. It would be a hazardous security issue if you could access files outside the project directory. Where should the server draw the line? If that were possible, your entire server would be accessible to the world.
Like #Robert asked, I also have a hard time imagining why the files must not be in the project folder.
If you want to reuse the icons/images between different projects you could create a resource package that contains only those images and add them as a dependency to your project.
If you want a better answer you need to provide more information about your requirements.
If you wrote your own server (using the HttpServer class), it may be possible to use VirtualDirectory to serve your external files.
Looking at the dartiverse_search example may give you some ideas.
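A rough sketch of that VirtualDirectory idea (assuming package:http_server is available; the icons path and port are placeholders):

    import 'dart:io';
    import 'package:http_server/http_server.dart';

    Future<void> main() async {
      // Serve a directory that lives entirely outside the project folder.
      final icons = VirtualDirectory('/absolute/path/to/icons')
        ..allowDirectoryListing = false;

      final server = await HttpServer.bind(InternetAddress.loopbackIPv4, 8081);
      await for (final request in server) {
        // Maps e.g. GET /img.png onto /absolute/path/to/icons/img.png.
        icons.serveRequest(request);
      }
    }

The page would then load icons from http://localhost:8081/img.png instead of a path relative to web/.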
You could put them in the lib directory and refer to them via /packages/Project/...
Or in another package, in which case they would be in a different place in the file system. But as other people have said, your requirement seems odd.

Make a folder like Dropbox that connects with a remote location

How can I make a folder that does things? Surely Dropbox knows when a file is put in its folder, and that file is synced. How can I make a folder that does the same, so that the files I put in it go to my FTP server?
I'm trying to do this on a Mac (surely, Dropbox works fine on a Mac).
I believe what you are looking for is a way to monitor when files are changed. Then, you can simply upload the changed file via FTP like you mentioned. If this is the case, the answer is to tie into the Windows Folder and File events. Here is a good article on how to do so:
http://www.codeproject.com/KB/files/MonitorFolderActivity.aspx
The code needed to FTP a file can be found here:
http://msdn.microsoft.com/en-us/library/ms229715.aspx
All of this is assuming you are going to be using C#. If you are going to use a different language, you will need to perform the same basic actions in the same basic manner but the syntax will be different.
To get started, this is all you need. You watch the folder for changes to any of the files. When you see a change, you upload the changed file via FTP (if that is your desired method of web transport) to the remote location. Of course, you would need to do the opposite for other clients. They would need to subscribe to events on your server that told them to download the latest versions of the changed files. Finally, you would need to apply your own business logic for things like how often you want the uploads to happen, if you want logging enabled for the changes, if you are going to do file versioning, etc.
One solution (Windows only + .NET) would be to run a client that monitors a folder with FileSystemWatcher and, when the change event fires, takes the appropriate action to sync with the FTP server.
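A compact sketch of that approach (the watch path, FTP URL, and credentials are all placeholders):

    using System;
    using System.IO;
    using System.Net;

    class FolderSync
    {
        static void Main()
        {
            var watcher = new FileSystemWatcher(@"C:\SyncedFolder")
            {
                IncludeSubdirectories = true,
                EnableRaisingEvents = true
            };
            // Fire an upload whenever a file appears or changes.
            watcher.Created += (s, e) => Upload(e.FullPath, e.Name);
            watcher.Changed += (s, e) => Upload(e.FullPath, e.Name);

            Console.WriteLine("Watching... press Enter to quit.");
            Console.ReadLine();
        }

        static void Upload(string localPath, string name)
        {
            // FtpWebRequest is the classic .NET FTP client from the MSDN link above.
            var request = (FtpWebRequest)WebRequest.Create("ftp://ftp.example.com/" + name);
            request.Method = WebRequestMethods.Ftp.UploadFile;
            request.Credentials = new NetworkCredential("user", "password");

            using (var source = File.OpenRead(localPath))
            using (var target = request.GetRequestStream())
            {
                source.CopyTo(target);
            }
        }
    }

In practice you would debounce the Changed event (it often fires more than once per save) and add the versioning/logging logic mentioned above.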
