I am trying to send a message containing Arabic text. I'm sure this is possible with the JavaMail API, but how? And what if I want to write in both English and Arabic, is that possible? If so, how?
I tried the following, but it didn't work:
message.setContent(messageText,"text/plain;charset=UTF-8");
message.setHeader("Content-Transfer-Encoding","Base64");
The output of message.getSubject() is always: ??????
You'll also need to use the MimeMessage setSubject overload that allows you to specify a charset. Using UTF-8 should work with most mail readers these days. You do need to be sure you're using Java Strings with the correct Unicode characters to begin with; if you're reading the data from somewhere, there are lots of opportunities to mess up the character encodings and conversions.
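For example, a minimal sketch along those lines using the standard javax.mail API (the SMTP host and addresses are placeholders):

import java.util.Properties;
import javax.mail.Session;
import javax.mail.Transport;
import javax.mail.internet.InternetAddress;
import javax.mail.internet.MimeMessage;

public class ArabicMailExample {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("mail.smtp.host", "smtp.example.com"); // placeholder SMTP host
        Session session = Session.getInstance(props);

        MimeMessage message = new MimeMessage(session);
        message.setFrom(new InternetAddress("me@example.com"));
        message.setRecipients(MimeMessage.RecipientType.TO, "you@example.com");

        // The overloads that take a charset keep Arabic (and mixed English/Arabic) text intact.
        message.setSubject("مرحبا Hello", "UTF-8");
        message.setText("هذه رسالة تجريبية - this is a test message.", "UTF-8");

        Transport.send(message);
    }
}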
Related
I created a logic app which gets data from a REST API. The content type of the data is application/protobuf (protobuf is a binary format made by Google: https://developers.google.com/protocol-buffers/docs/tutorials). I know that Logic Apps uses base64 encoding, so it converts the data to base64. The data needs to stay in binary form so it can be deserialised with the protoc compiler. So I store this data in a variable and try to pass it on for further processing, but unfortunately the data stored in the variable turns into boxes and ?, so the protoc compiler fails to deserialise it.
I tried base64ToBinary() and http() as per the suggestion in the "other content type" section of the Microsoft docs: https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-content-type#applicationxml-and-applica...
But it did not work. Can anybody help with this?
I tried reproducing the issue and found that the REST API body we receive is transformed back to a string when we use the string() dynamic expression, i.e. string(triggerBody()).
Later, I convert it to binary using the binary() dynamic expression, i.e. binary(outputs('Compose')), which returns an application/octet-stream result. When I then tried converting back to a string, everything worked fine for me.
Here is the flow of my logic app
and here is the output
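To illustrate the underlying issue outside Logic Apps, here is a small Java sketch (the byte values are just an arbitrary stand-in for a protobuf payload): decoding the base64 text back to bytes preserves the payload exactly, while forcing the raw bytes through a text string corrupts it.

import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import java.util.Base64;

public class BinaryRoundTrip {
    public static void main(String[] args) {
        // Arbitrary stand-in for a binary protobuf payload (not real protobuf).
        byte[] original = {0x08, (byte) 0x96, 0x01, (byte) 0xFF, 0x00};

        String base64 = Base64.getEncoder().encodeToString(original);

        // Base64 -> bytes: the payload survives untouched.
        byte[] restored = Base64.getDecoder().decode(base64);
        System.out.println(Arrays.equals(original, restored)); // true

        // Bytes -> text string -> bytes: sequences that are not valid UTF-8 are
        // replaced with the replacement character (shown as boxes or ?), so a
        // protobuf parser can no longer deserialise the result.
        String asText = new String(original, StandardCharsets.UTF_8);
        byte[] corrupted = asText.getBytes(StandardCharsets.UTF_8);
        System.out.println(Arrays.equals(original, corrupted)); // false
    }
}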
I have changed my approach: instead of streaming, I am storing the data in a file and processing it further from there.
After testing the URL in the Google Structured Data Testing Tool, I don't know why question mark characters are shown instead of the UTF-8 characters.
What is wrong? Any help is really appreciated.
The URL is: link
And the result image:
The underlying JSON is:
I don't know why, but the structured data tool doesn't understand your Farsi. If the Farsi is written as JavaScript escapes, like \u0645\u0648\u0633\u0633\u0647, there is no problem. But if it is written literally, like علائم تیروئید کم کار, something weird happens.
Quick and dirty solution: encode all of your structured data content as JavaScript escapes. In Notepad++ the plugin named HTML Tag does this: select the text, then press Ctrl+J.
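If you would rather do the escaping in code than in an editor, here is a small Java sketch of the same idea (the helper name is just for illustration): every non-ASCII character is turned into a \uXXXX escape.

public class JsEscape {
    // Replace every non-ASCII character with a \uXXXX escape.
    static String escapeNonAscii(String input) {
        StringBuilder out = new StringBuilder();
        for (char c : input.toCharArray()) {
            if (c < 128) {
                out.append(c);
            } else {
                out.append(String.format("\\u%04x", (int) c));
            }
        }
        return out.toString();
    }

    public static void main(String[] args) {
        System.out.println(escapeNonAscii("موسسه")); // prints \u0645\u0648\u0633\u0633\u0647
    }
}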
I'm trying to connect to MongoDB with a new password containing special characters, as shown in the screenshot. It gives a "URL malformed" error due to the special characters in it.
Initially I was able to connect with a normal password.
I tried adding escape characters to the password, with no luck.
Has anybody come across this situation or know the solution? Any leads highly appreciated. Thanks.
Click on "Fill in connection fields individually", then input the fields.
It will create the URI with escaped characters for you.
If your password includes special characters, you need to percent-encode all of them. You can find the encodings at this link: https://www.w3schools.com/tags/ref_urlencode.asp
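For example, a minimal Java sketch of the idea (the password and connection details are made up); URLEncoder targets form encoding, so the plus sign it produces for spaces is swapped for %20:

import java.net.URLEncoder;

public class MongoUriExample {
    public static void main(String[] args) throws Exception {
        // Hypothetical password containing special characters.
        String rawPassword = "p@ss/w:rd#1";

        // Percent-encode the password before placing it in the connection string.
        String encoded = URLEncoder.encode(rawPassword, "UTF-8").replace("+", "%20");

        String uri = "mongodb://myUser:" + encoded + "@localhost:27017/mydb";
        System.out.println(uri);
        // mongodb://myUser:p%40ss%2Fw%3Ard%231@localhost:27017/mydb
    }
}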
I have a grid, and data arrives via Ext Direct, for example (as seen in Firebug):
But in the grid you see:
If I double-click to show the full data in the window title, that data displays without trouble (it comes from the record):
The other columns render their data without problems. Do you have any idea how to fix it?
I also live in a country where we need accented characters (Slovakia), and I have been through all the possible encodings and localization troubles. In the end, UTF-8 is the real solution. Since I use UTF-8 in the database, in the Apache (Ajax) headers and in the HTML of the page, I have no problems.
Therefore, check whether the encoding changes somewhere along the route, check whether UTF-8 is really being sent from the server, check all headers, etc. The only likely problem is that UTF-8 stops being used somewhere along the way.
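If the backend happens to be Java, a minimal servlet sketch of "make sure UTF-8 is really sent from the server" could look like this (the class name and sample value are made up):

import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class GridDataServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        // Declare and write the response as UTF-8 before obtaining the writer.
        resp.setContentType("application/json");
        resp.setCharacterEncoding("UTF-8");
        resp.getWriter().write("{\"name\":\"Žilina\"}"); // accented sample value
    }
}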
I am new to CakePHP and I am developing a website in Arabic. I used slugs in Arabic characters, but my problem is that the cache file names show characters like Ù«, ØÃ, ì, ù, à in place of the normal characters.
For example:
post_من-Ù†ØÙ† (the slug is: من-نحن)
post_٬مŠ-ومدØÙع-كمروي-ÙÙŠ-تØرÙات-ااØ
So what do I need to do to get a cache file name like this:
post_من-نحن instead of post_من-Ù†ØÙ†
As far as I know this is a PHP problem: the filesystem API (just like so many others) simply doesn't support Unicode, so you'll end up with the ANSI representation of your string.
https://bugs.php.net/bug.php?id=46990
https://www.google.com/search?q=php+bugs+unicode+filesystem+support&hl=en
You might be able to get it working using the COM extension; however, that would require OS-specific code, and if it works at all it would also require changes to the CakePHP core, or you would have to use your own caching mechanism. So it's probably easier to simply live with it and wait for the legendary PHP6 with its full Unicode support ‹’’›(Ͼ˳Ͽ)‹’’›