I'm making a dating sim as my first game, and I want to use a JSON file to store all of my dialogue, sprites, and BGM, kind of like an API; but the file I wrote won't appear in the Godot FileSystem dock.
I can't get the file path without it. Is there a way to make it appear, or should I just give up?
Try using the File API and read the data into a variable, like so:
var data = {}
var path = "res://data.json"

func _ready():
	var jsonfile = File.new()
	jsonfile.open(path, File.READ)
	# parse the whole file into a Dictionary
	data = parse_json(jsonfile.get_as_text())
	jsonfile.close()
	print(data)
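Once parse_json has run, data is just a Dictionary, so you can pull entries out by key. A small sketch (the keys below are made up; use whatever structure your own data.json has):

func get_dialogue(scene_id):
	# e.g. data.json: {"scene1": {"speaker": "Mia", "line": "Welcome!", "bgm": "res://bgm/theme.ogg"}}
	var entry = data[scene_id]
	print(entry["speaker"] + ": " + entry["line"])
	return entry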
You can also install the JSON Editor asset from the Godot Asset Library - https://godotengine.org/asset-library/asset/656
I was wondering: if I store a video or a movie and open that box, will the video be loaded into my RAM, or will it just be read from ROM (storage)? I am a bit confused. Can anyone explain this to me?
I think you have misunderstood the concept of a database.
Any database solution is there to store pure, organized informational data, not large files such as media, documents, or images.
Storage, on the contrary, need not be organized; all the files can live in one folder.
So, whatever database solution you use, always store plain data types.
In this case you can define a data model, which is an essential part of using a database anyway.
import 'package:hive/hive.dart';

part 'movie.g.dart';

@HiveType(typeId: 0)
class Movie extends HiveObject {
  @HiveField(0)
  String name;

  @HiveField(1)
  String path; // store the file path as a string, not the file itself
}
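Before a box can hold Movie objects, the generated type adapter has to be registered. A minimal sketch, assuming you run hive_generator/build_runner so that MovieAdapter exists, and the hive_flutter package for initFlutter:

import 'package:hive_flutter/hive_flutter.dart';

Future<void> initHive() async {
  await Hive.initFlutter();             // sets up a storage directory for Hive
  Hive.registerAdapter(MovieAdapter()); // adapter generated by hive_generator
}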
Since Hive supports Dart objects, you don't have to convert anything toJson or the like for storing the data.
So once you have the file fetched from storage, you can get the path using path_provider or from the File itself, and then create an object:
import 'dart:io';
import 'package:hive/hive.dart';

Future<void> saveMovie(File file) async {
  // get the movie file using any means, then keep only its path
  final path = file.path;
  var box = await Hive.openBox('Movies');
  var m = Movie()
    ..name = 'Batman Begins'
    ..path = path;
  await box.add(m);
}
Hope this clears your doubt.
Copy/save your video/media files to local file storage and save the file path in a Hive box.
Whenever you need the video, get the path from Hive and then load the file from local storage using that path.
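For example, a minimal sketch of the read side, assuming the Movie model above and a box named 'Movies':

import 'dart:io';
import 'package:hive/hive.dart';

Future<File> loadMovieFile(int index) async {
  var box = await Hive.openBox('Movies');
  // only the path comes out of Hive; the video itself stays on disk
  Movie m = box.getAt(index);
  return File(m.path);
}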
I'm fairly new to Dart and Flutter, and I'm having trouble overwriting an existing asset image with a source image.
My attempt:
try {
  File localFile = File('assets/images/myImage.png');
  localFile.writeAsBytesSync(originFile.readAsBytesSync());
} catch (e) {
  log(e.toString());
}
I get:
[log] FileSystemException: Cannot open file, path = 'assets/images/myImage.png' (OS Error: No such file or directory, errno = 2)
I did define the assets folder in pubspec.yaml:
assets:
- assets/images/
Ok, so I've read somewhere that the asset file can be accessed like this:
import 'package:flutter/services.dart' show rootBundle;
final byteData = await rootBundle.load('assets/images/myImage.png');
But I don't know how to convert byteData to a File object that represents the actual file.
I think I'm missing something very basic here. Or maybe there is a proper way to do this that has nothing to do with this approach?
Please help.
Thanks in advance!
If you want to write a file on the user's device you should look here: https://flutter.dev/docs/cookbook/persistence/reading-writing-files
That cookbook writes to the application documents directory, a space on the phone where your app is allowed to write, so it's exactly what you want!
Assets are part of your app and are not meant to be modified from within the app. According to the Flutter website: "During a build, Flutter places assets into a special archive called the asset bundle that apps read from at runtime."
Hope this helps!
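A minimal sketch of that idea, assuming the path_provider package: copy the bundled asset into the writable documents directory once, then overwrite and read that copy instead of the asset.

import 'dart:io';
import 'package:flutter/services.dart' show rootBundle;
import 'package:path_provider/path_provider.dart';

Future<File> copyAssetToDocuments() async {
  final byteData = await rootBundle.load('assets/images/myImage.png');
  final dir = await getApplicationDocumentsDirectory();
  final file = File('${dir.path}/myImage.png');
  // the ByteData from rootBundle becomes a plain byte list for File.writeAsBytes
  return file.writeAsBytes(byteData.buffer
      .asUint8List(byteData.offsetInBytes, byteData.lengthInBytes));
}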
The workflow of my function is the following:
retrieve a jpg through a Python GET request
save the image as png on disk (even though it is downloaded as jpg)
use imageio to read the image from disk and transform it into a numpy array
work with the array
This is what I do to save:
import requests

response = requests.get(urlstring, params=params)
if response.status_code == 200:
    # write the raw downloaded bytes to disk
    with open('PATH%d.png' % imagenumber, 'wb') as output:
        output.write(response.content)
This is what I do to load the png and transform it into an np.array (with imageio imported as im):
imagearray = im.imread('PATH%d.png' % imagenumber)
Since I don't need to permanently store what I download, I tried to modify my function so that it transforms response.content into a NumPy array directly. Unfortunately, every imageio-like library works the same way: it reads a URI from disk and converts it to an np.array.
I tried this, but obviously it didn't work since it needs a URI as input:
response = requests.get(urlstring, params=params)
imagearray = im.imread(response.content)
Is there any way to overcome this issue? How can I transform my response.content into an np.array?
imageio.imread is able to read from urls:
import imageio
url = "https://example_url.com/image.jpg"
# image is going to be type <class 'imageio.core.util.Image'>
# that's just an extension of np.ndarray with a meta attribute
image = imageio.imread(url)
You can look for more information in the documentation, they also have examples: https://imageio.readthedocs.io/en/stable/examples.html
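If your request actually needs the params from the question, one option is to keep using requests and hand imageio a file-like object instead of a path; a sketch reusing urlstring and params from the question:

import imageio as im
import requests
from io import BytesIO

response = requests.get(urlstring, params=params)
if response.status_code == 200:
    # wrap the downloaded bytes in an in-memory file; nothing touches the disk
    imagearray = im.imread(BytesIO(response.content))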
You can use BytesIO as a file-like object to skip writing to an actual file:
from io import BytesIO
from PIL import Image
import numpy as np

bites = BytesIO(response.content)
Now you have it as a BytesIO, so you can use it just like a file:
img = Image.open(bites)
img_np = np.array(img)
I'm currently struggling with a problem handling some KML files with Google Maps in my JavaScript application.
I wrote a method that reads a KML file from a URL or from my local file system and stores the content as a string in a database. Now I would like to activate layers that are stored in my DB by clicking a button. Everything is fine up to here.
In every example I can find, they only use the url attribute of a KmlLayer, passing a URL to a KML file, like here:
var ctaLayer = new google.maps.KmlLayer({
  url: 'http://googlemaps.github.io/js-v2-samples/ggeoxml/cta.kml',
  map: map
});
But since my files are stored as strings in my DB, I don't have a URL to a file, only the content. I can't find a way to pass just the XML string as content.
Is there somebody here who can help?
Maybe someday somebody will struggle with a similar problem. The solution was a little bit tricky: I needed to create a Blob with the content of my string, and from the blob I created an object URL, which you can pass to a client-side KML parser. (google.maps.KmlLayer itself won't work here, because Google's servers have to fetch the KML from a publicly accessible URL, which a local blob: URL is not.) I used https://github.com/geocodezip/geoxml3 for that.
vm.activeLayers.forEach(function(value, key) {
  // wrap the KML string in a Blob and expose it through an object URL
  var file = new Blob([value], {type: 'kml'});
  var url = URL.createObjectURL(file);
  var myParser = new geoXML3.parser({
    map: map
  });
  myParser.parse(url);
});
I am new to iPhone development. I am writing my app in MonoTouch.
I am trying to get a photo from the asset library, which I can do successfully; however, I get the path as a URL, "assets-library://asset/asset.JPG". I want to do a FileStream read, which asks for a file path. How can I convert the NSURL to a file path?
I used asset.DefaultRepresentation.Url.AbsoluteString, which gives the following filePath:
assets-library://asset/asset.JPG?id=1000000001&ext=JPG
Then, when it hits FileStream(filePath, FileMode.Open, FileAccess.Read), it throws an exception stating "Could not find a part of the path."
Please help :)
The URL you get from the assets library is not a file path. It is a URL to a resource inside the assets library, so you cannot use FileStream to read its data.
If the asset is an image, you can get its contents into an NSData object like this:
UIImage image = UIImage.FromImage(asset.DefaultRepresentation.GetImage());
NSData data = image.AsJPEG();
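If the rest of your code really wants a Stream (the way a FileStream would be used), one option is to copy the NSData into managed memory; a sketch continuing from the data variable above:

// copy the unmanaged NSData bytes into a managed byte[] ...
byte[] bytes = new byte[data.Length];
System.Runtime.InteropServices.Marshal.Copy(data.Bytes, bytes, 0, (int)data.Length);

// ... and expose them as an in-memory stream instead of a FileStream
using (var stream = new System.IO.MemoryStream(bytes))
{
    // read from stream exactly as you would from a FileStream
}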
You can use two different static methods to load an image from a path: FromFile and FromBundle. The latter caches the image once iOS has loaded it. Both take the path to your image.
When you use these methods, the image file's Build Action must be set to Content: right-click the image in your project and set Properties -> Build Action -> Content.
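For example (the path "Images/photo.jpg" is just a placeholder for an image shipped with your app and marked as Content):

// FromBundle caches the image after the first load; FromFile re-reads it each time
UIImage cached = UIImage.FromBundle("Images/photo.jpg");
UIImage uncached = UIImage.FromFile("Images/photo.jpg");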