I am trying to display real-time traffic information on HERE maps along my route. However, it doesn't seem to show up on the route I have selected, although it does show up in other places. Could someone shed some light on this issue? I have attached screenshots of my application.
private void createRoute(RouteOptions.TransportMode transportMode, GeoCoordinate startCoordinate, GeoCoordinate endCoordinate) {
    CoreRouter coreRouter = new CoreRouter();
    RoutePlan routePlan = new RoutePlan();

    RouteWaypoint startPoint = new RouteWaypoint(startCoordinate);
    RouteWaypoint destination = new RouteWaypoint(endCoordinate);
    routePlan.addWaypoint(startPoint);
    routePlan.addWaypoint(destination);

    RouteOptions routeOptions = new RouteOptions();
    routeOptions.setTransportMode(transportMode);
    routeOptions.setHighwaysAllowed(false);
    routeOptions.setRouteType(RouteOptions.Type.SHORTEST);
    routeOptions.setRouteCount(1);
    routePlan.setRouteOptions(routeOptions);

    trafficUpdater = TrafficUpdater.getInstance();
    trafficUpdater.enableUpdate(true);

    TrafficUpdater.GetEventsListener myGetEventsListener = new TrafficUpdater.GetEventsListener() {
        @Override
        public void onComplete(List<TrafficEvent> trafficEvent, TrafficUpdater.Error error) {
            if (error == TrafficUpdater.Error.NONE) {
                // How do I use the trafficEvent callback data to affect m_mapRoute?
            } else {
            }
        }
    };

    TrafficUpdater.Listener myTrafficUpdaterListener = new TrafficUpdater.Listener() {
        @Override
        public void onStatusChanged(TrafficUpdater.RequestState requestState) {
            if (requestState.equals(TrafficUpdater.RequestState.DONE)) {
                trafficUpdater.getEvents(m_mapRoute.getRoute(), myGetEventsListener);
            }
        }
    };

    CoreRouter.Listener myCoreRouterListener = new CoreRouter.Listener() {
        @Override
        public void onProgress(int i) {
            // The calculation progress can be retrieved in this callback.
        }

        @Override
        public void onCalculateRouteFinished(List<RouteResult> routeResults, RoutingError routingError) {
            if (routingError == RoutingError.NONE) {
                if (routeResults.get(0).getRoute() != null) {
                    m_mapRoute = new MapRoute(routeResults.get(0).getRoute());
                    trafficUpdater.request(m_mapRoute.getRoute(), 100, myTrafficUpdaterListener);
                    m_map.addMapObject(m_mapRoute);
                    m_mapRoute.setRenderType(MapRoute.RenderType.SECONDARY);
                    GeoBoundingBox gbb = routeResults.get(0).getRoute().getBoundingBox();
                    m_map.zoomTo(gbb, Map.Animation.NONE, Map.MOVE_PRESERVE_ORIENTATION);
                } else {
                    Toast.makeText(m_activity,
                            "Error: route results returned is not valid",
                            Toast.LENGTH_LONG).show();
                }
            } else {
                Toast.makeText(m_activity,
                        "Error: route calculation returned error code: " + routingError,
                        Toast.LENGTH_LONG).show();
            }
        }
    };

    coreRouter.calculateRoute(routePlan, myCoreRouterListener);
}
The info that I am getting back is shown below:
Event Text: Traffic flow message.
Short Text: FLOW
Severity: HIGH
Affected Streets: [Norra Vallgatan]
Affected Length: 500
From Streets: [Hamngatan]
To Streets: [Slottsgatan]
Speed Limit: 17
Flow: True
Incident: False
Visible: True
Penalty: 0
That is the callback result delivered to TrafficUpdater.GetEventsListener.
Edit: I've included the coordinates below:
address =[Triangeln ],
start location =[Lat: 55.61151575555467, Long: 12.994461363051823, Alt: 0.0],
end location =[Lat: 55.59671, Long: 13.00098, Alt: 1.073741824E9]
address =[Gothenburg ],
start location =[Lat: 55.61163785303746, Long: 12.994434358214308, Alt: 0.0],
end location =[Lat: 57.70068, Long: 11.96823, Alt: 1.073741824E9]
Hope someone can help!
Disable traffic auto-update and request traffic updates only for your route, as shown below. In your code you have requested traffic updates within a 100-meter radius around your route; that could be the reason.
map.disableTrafficAutoUpdate();
trafficUpdater.request(mapRoute.getRoute(), myTrafficUpdaterListener);
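For context, here is a minimal sketch of how those two calls could slot into the question's onCalculateRouteFinished callback, reusing m_map, m_mapRoute, trafficUpdater, and myTrafficUpdaterListener from the code above:
// Sketch only: the fields below are taken from the question's code.
@Override
public void onCalculateRouteFinished(List<RouteResult> routeResults, RoutingError routingError) {
    if (routingError == RoutingError.NONE && routeResults.get(0).getRoute() != null) {
        m_mapRoute = new MapRoute(routeResults.get(0).getRoute());
        m_map.addMapObject(m_mapRoute);

        // Stop the map from auto-fetching traffic for the whole viewport...
        m_map.disableTrafficAutoUpdate();

        // ...and request traffic for the route itself, with no extra radius.
        trafficUpdater.request(m_mapRoute.getRoute(), myTrafficUpdaterListener);
    }
}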
I am facing an issue while reading data from a response and adding it to Room DB inside a loop: sometimes it skips data while adding it.
// Below is the code snippet
fun subscribeFeed() {
    try {
        val test =
            apolloClient(mContext!!).subscribe(FeedSubscription(Channel.NORMAL))
        val callback =
            object : ApolloSubscriptionCall.Callback<FeedSubscription.Data> {
                override fun onResponse(response: com.apollographql.apollo.api.Response<FeedSubscription.Data>) {
                    if (response.data?.Feed?.asMessage?.__typename.equals("Message")) {
                        for (jj in response.data?.Feed?.asMessage?.Ids!!.indices) {
                            try {
                                val noteDateAdded = Date()
                                val newNote = ChatAllMsg(
                                    noteDateAdded,
                                    1,
                                    response.data?.Feed?.asMessage!!.text ?: "",
                                    response.data?.Feed?.asMessage?.Ids!![jj],
                                    response.data?.Feed?.asMessage?.type.toString(),
                                    "",
                                    "issent",
                                    response.data?.Feed?.asMessage?.time?.toFloat(),
                                    0,
                                    "",
                                    response.data?.Feed?.asMessage?.Type?.toString()
                                )
                                MainScope().launch {
                                    noteDatabase.addMsg(newNote)
                                }
                            } catch (e: Exception) {
                                Log.e("catcherror", e.message.toString())
                            }
                        }
                    }
                }
            }
        test.execute(callback)
    } catch (exception: ApolloException) {
        throw exception
    }
}
Inside MainScope().launch {} I am adding data to the database after getting the response.
My main issue is that it skips some data and does not add it to the database, even though I am getting the data in the response.
Help is very much appreciated.
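A hedged guess at the cause, since each row is inserted from its own fire-and-forget coroutine: the per-item MainScope().launch calls are not guaranteed to run in order or to finish before the next response arrives, so batching the inserts into a single coroutine is worth trying. A minimal sketch, assuming addMsg is a suspend DAO function and reusing the ChatAllMsg type from the question (NoteDao is a hypothetical name for the DAO):
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.SupervisorJob
import kotlinx.coroutines.launch

class ChatRepository(private val noteDatabase: NoteDao) {
    // A scope that survives the caller and does DB work off the main thread.
    private val dbScope = CoroutineScope(SupervisorJob() + Dispatchers.IO)

    // Insert the whole batch from one coroutine so rows are written in
    // order and no insert is dropped mid-loop.
    fun addAll(messages: List<ChatAllMsg>) {
        dbScope.launch {
            messages.forEach { noteDatabase.addMsg(it) }
        }
    }
}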
I'm studying WebRTC with a video chat project.
Now I'm really exhausted... I don't know why it doesn't work.
My own video stream works pretty well in the browser.
But when I try to join with a new browser, there are no error messages, and I would expect to see two video views, but there is only one.
Actually two video elements are created by the script, but the peer's video view doesn't work.
I can see the peer's stream data with console.log(peerStream), and it looks similar to the myStream data.
But it still doesn't play.
Here is the code:
async function getMedia(deviceId) {
  const initialConstraints = {
    audio: true,
    video: { facingMode: "user" },
  };
  const cameraConstraints = {
    audio: true,
    video: { deviceId: { exact: deviceId } },
  };
  try {
    myStream = await navigator.mediaDevices.getUserMedia(
      deviceId ? cameraConstraints : initialConstraints
    );
    // Mute the HTML video element, not the stream itself.
    console.log("myVideo : ", myStream, myVideo);
    addVideoStream(myVideo.current, myStream);
    // videoGrid.current.append(myVideo.current);
    if (!deviceId) {
      // Muted by default.
      myStream
        .getAudioTracks()
        .forEach((track) => (track.enabled = false));
      await getCameras();
    }
  } catch (error) {
    console.log(error);
  }
}
function paintPeerFace(peerStream, id, remoteNickname) {
  console.log("peerStream : ", peerStream, id, remoteNickname);
  const peerVideo = document.createElement("video");
  console.log("const peerVideo : ", peerVideo);
  peerVideo.setAttribute("autoplay", "playsinline");
  // peerVideo.autoplay = true;
  // peerVideo.playsInline = true;
  peerVideo.width = "400";
  peerVideo.height = "400";
  peerVideo.className = id;
  console.log("const peerVideo : ", peerVideoTemp);
  addVideoStream(peerVideoTemp.current, peerStream);
  videoGrid.current.append(peerVideo);
  setUsers(videoGrid.current.childElementCount);
  // sortStreams();
}
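As a side note, addVideoStream is not shown in the question; a typical implementation (a sketch of the usual pattern, not necessarily this project's helper) attaches the stream and starts playback once metadata is ready. Note also that the stream above is attached to peerVideoTemp.current while the element appended to the grid is peerVideo, which may be worth double-checking:
// Sketch of a typical addVideoStream helper (an assumption, since the
// project's actual helper is not shown in the question).
function addVideoStream(video, stream) {
  video.srcObject = stream;
  video.addEventListener("loadedmetadata", () => {
    video.play();
  });
}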
And here is the git repository:
https://github.com/jsw4215/webRTC_prac.git
Server: Express
Client: React
Following the Readme, you can see it in the browser.
Thank you for your help! I really appreciate it!
Please help me!!
I am creating a ReportViewer with Syncfusion in AngularJS in ASP.NET MVC.
I have downloaded the community version of Syncfusion.
My ApiController:
public class Pay_rep_ReportViewerApiController : ApiController, IReportController
{
    public object PostReportAction(Dictionary<string, object> jsonResult)
    {
        try
        {
            var output = ReportHelper.ProcessReport(jsonResult, this);
        }
        catch (Exception exxxx)
        {
        }
        return ReportHelper.ProcessReport(jsonResult, this);
    }

    // Get action for getting resources from the report
    [System.Web.Http.ActionName("GetResource")]
    [AcceptVerbs("GET")]
    public object GetResource(string key, string resourcetype, bool isPrint)
    {
        return ReportHelper.GetResource(key, resourcetype, isPrint);
    }

    // Method will be called to initialize the report options before processing starts
    public void OnInitReportOptions(ReportViewerOptions reportOption)
    {
        if (reportOption.ReportModel.ReportPath.Contains("RSSMRP01.rdlc"))
        {
            reportOption.ReportModel.ReportPath = System.Web.HttpContext.Current.Server.MapPath("~/App_Data/RSSMRP01.rdlc");
        }
        if (reportOption.ReportModel.ReportPath.Contains("~/Views/PayModule/Pay_rep_employee.rdlc"))
        {
            reportOption.ReportModel.ReportPath = System.Web.HttpContext.Current.Server.MapPath(reportOption.ReportModel.ReportPath);
        }
    }

    // Method will be called when the report is loaded
    public void OnReportLoaded(ReportViewerOptions reportOption)
    {
    }
}
In my JS controller:
$scope.reportServiceUrl = '/api/Pay_rep_ReportViewerApi';
$scope.remoteMode = ej.ReportViewer.ProcessingMode.Local;
$scope.rdlReportPath = '~/Views/PayModule/Pay_rep_employee.rdlc';
In my view:
<div id="container" ej-reportviewer e-reportserviceurl="reportServiceUrl"
e-processingmode="remoteMode" e-isresponsive="true"
e-reportpath="rdlReportPath" style="width:100%;height:680px;"
e-datasources="dataSource">
</div>
I get the following JSON in the ApiController when I set a breakpoint:
{
  inProgress = completed,
  isReportLoad = True,
  paramInfo = NoParams,
  reportViewerToken = 930af59b-fd2d-473c-b453-e8df4343ecc7,
  reportViewerID = 930af59b-fd2d-473c-b453-e8df4343ecc7,
  dataSources = ,
  dataSets = ,
  reportParameters = ,
  isRDLC = False,
  errorInfo = ,
  serviceType = Default
}
But on screen the ReportViewer shows "The given key was not present in the dictionary.", while the server demo works fine.
References have been added.
Is it feasible to solve this using the community version? Thanks in advance.
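Not an official fix, but one thing stands out in the posted controller: PostReportAction runs ReportHelper.ProcessReport twice, once inside a try/catch whose exception is silently swallowed and once more in the return statement. Processing the same jsonResult twice is a plausible source of state errors such as a missing dictionary key, so a minimal cleanup sketch would be:
public object PostReportAction(Dictionary<string, object> jsonResult)
{
    try
    {
        // Process the report exactly once and return that result.
        return ReportHelper.ProcessReport(jsonResult, this);
    }
    catch (Exception ex)
    {
        // Surface the error instead of swallowing it, so the real cause
        // of "The given key was not present" shows up in the logs.
        System.Diagnostics.Trace.TraceError(ex.ToString());
        throw;
    }
}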
I am using the mp4parser isoviewer-1.0-RC-35.jar to combine clips recorded with the Android MediaRecorder. The clips seem to get combined correctly, judging by the audio tracks, but the video stays on one frame and the time code stays at zero on playback.
MediaRecorder code at the time the individual clips are created:
mediaRecorder = new MediaRecorder();
myCamera.lock();
myCamera.unlock();
String clipLocation = file.getAbsolutePath();
_moviePaths.add(clipLocation);

// Please maintain the sequence of the following code.
// If you change the sequence it will not work.
mediaRecorder.setCamera(myCamera);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);

if (facingBack) {
    mediaRecorder.setOrientationHint(90);
} else {
    mediaRecorder.setOrientationHint(270);
}

// Log.v("cam", "supported vid sizes: " + myCamera.getParameters().getSupportedVideoSizes());
CamcorderProfile profile = CamcorderProfile.get(CamcorderProfile.QUALITY_720P);
// mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
// mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
// mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
mediaRecorder.setMaxDuration(g.kMaxVideoDurationInMiliseconds); // 15 seconds
mediaRecorder.setProfile(profile);
mediaRecorder.setPreviewDisplay(surfaceHolder.getSurface());
mediaRecorder.setOutputFile(path + filename);
mediaRecorder.prepare();
startTimer();
mediaRecorder.start();
}
The method I am using to combine the clips:
protected void combineClips() throws IOException {
    for (int i = 0; i < _moviePaths.size(); i++) {
        Movie tm = MovieCreator.build(_moviePaths.get(i));
        _clips.add(tm);
    }

    List<Track> videoTracks = new LinkedList<Track>();
    List<Track> audioTracks = new LinkedList<Track>();

    for (Movie m : _clips) {
        for (Track t : m.getTracks()) {
            if (t.getHandler().equals("soun")) {
                audioTracks.add(t);
            }
            if (t.getHandler().equals("vide")) {
                videoTracks.add(t);
            }
        }
    }

    Movie result = new Movie();
    Log.v("cam", "adding: " + audioTracks.size() + " audio tracks and " + videoTracks.size() + " video tracks");

    if (audioTracks.size() > 0) {
        result.addTrack(new AppendTrack(audioTracks.toArray(new Track[audioTracks.size()])));
    }
    if (videoTracks.size() > 0) {
        result.addTrack(new AppendTrack(videoTracks.toArray(new Track[videoTracks.size()])));
    }

    Container out = new DefaultMp4Builder().build(result);
    FileChannel fc = new RandomAccessFile(String.format(videoFolder.getPath() + "/output.mp4"), "rw").getChannel();
    out.writeContainer(fc);
    fc.close();
}
Apparently the problem had something to do with the library: isoviewer-1.0-RC-35.jar. I replaced it with isoviewer-1.0-RC-27.jar and now everything is just dandy!
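If you pull the library from Maven Central instead of dropping in a local jar, the equivalent downgrade would look something like the following. The coordinates are an assumption based on mp4parser's isoparser artifact (the jar in the question is an isoviewer build), so verify them against your repository:
<!-- Assumed coordinates; double-check the artifact name on Maven Central. -->
<dependency>
    <groupId>com.googlecode.mp4parser</groupId>
    <artifactId>isoparser</artifactId>
    <version>1.0-RC-27</version>
</dependency>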
Grabbing some content from a WCF Data Service into my view model is straightforward:
public const string RequestsPropertyName = "Requests";
private DataServiceCollection<Request> _requests = null;

public DataServiceCollection<Request> Requests
{
    get { return _requests; }
    set
    {
        if (_requests == value) { return; }
        var oldValue = _requests;
        _requests = value;
        RaisePropertyChanged(RequestsPropertyName, oldValue, value, true);
    }
}
and then
Requests.LoadAsync(query);
But what if I have a property which is not a collection?
public const string RequestDetailsPropertyName = "RequestDetails";
private Request _requestDetails = null;

public Request RequestDetails
{
    get { return _requestDetails; }
and so on.
Where do I get the 'LoadAsync(query)' method from?
Thank you,
Ueli
This is a pretty simple thing to do. You just need to use the DomainContext in your application: that is where you create your query, and then you apply the result to your property.
Here is an example of what this might look like in your code:
void LoadRequest(int requestID)
{
    var query = workContext.GetRequestByIDQuery(requestID);
    workContext.Load(query, lo =>
    {
        DispatcherHelper.CheckBeginInvokeOnUI(() =>
        {
            if (lo.HasError)
                throw lo.Error;
            else
                RequestDetails = lo.Entities.Single();
        });
    }, null);
}
In this example, the workContext object is the DomainContext. The query here is a specific version defined on the server; you can also just construct the query client-side with:
.Where(r => r.RequestID == requestID)
After the async call, it throws any errors that occurred during the call and then returns the only entity returned. If you get more than one entity, you might use .First() instead.
If this is not enough to get you going, let me know and I can explain further.
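For completeness, a client-side variant of the same load might look like the sketch below. GetRequestsQuery() is an assumed base query on the context, and FirstOrDefault guards the case where no entity matches:
void LoadRequestClientSide(int requestID)
{
    // Assumption: the DomainContext exposes a base query such as
    // GetRequestsQuery(); filter it client-side before loading.
    var query = workContext.GetRequestsQuery()
                           .Where(r => r.RequestID == requestID);

    workContext.Load(query, lo =>
    {
        if (lo.HasError)
        {
            lo.MarkErrorAsHandled(); // keep the unhandled error from rethrowing
        }
        else
        {
            RequestDetails = lo.Entities.FirstOrDefault();
        }
    }, null);
}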