Is there any way to execute directional movement (turn left, move forward, etc.) with DroneKit?
Thanks.
If you are using DroneKit for Copter, you can use velocity control or target positioning; the DroneKit documentation explains this very well here: http://python.dronekit.io/guide/copter/guided_mode.html. But if you want to control Plane, you can use channel overrides, which basically work by emulating a remote control. This is also explained in the documentation: http://python.dronekit.io/examples/channel_overrides.html.
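For example, a velocity-control sketch along the lines of the linked guide might look like this. It assumes a connected vehicle object that is already armed and in GUIDED mode; the 1 m/s speed and 5 second duration are just placeholders.

import time
from pymavlink import mavutil

def send_ned_velocity(vehicle, vx, vy, vz, duration):
    # Build a SET_POSITION_TARGET_LOCAL_NED message with only the
    # velocity fields enabled (see the type_mask bits).
    msg = vehicle.message_factory.set_position_target_local_ned_encode(
        0, 0, 0,                              # time_boot_ms, target system, target component
        mavutil.mavlink.MAV_FRAME_LOCAL_NED,  # frame: velocities in m/s, NED
        0b0000111111000111,                   # type_mask: use velocities only
        0, 0, 0,                              # x, y, z positions (ignored)
        vx, vy, vz,                           # velocities in m/s
        0, 0, 0,                              # accelerations (ignored)
        0, 0)                                 # yaw, yaw_rate (ignored)
    # Resend at 1 Hz for the requested duration.
    for _ in range(duration):
        vehicle.send_mavlink(msg)
        time.sleep(1)

# Fly north at 1 m/s for 5 seconds. Note that MAV_FRAME_LOCAL_NED is
# world-relative; for "move forward" relative to the vehicle's current
# heading, use mavutil.mavlink.MAV_FRAME_BODY_OFFSET_NED instead.
send_ned_velocity(vehicle, 1, 0, 0, 5)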
I have a simple app that launches and then draws a color full-screen using EGL/OpenGL ES. I am using DRM and GBM. It works great if I switch to a framebuffer console and launch from there. However, if I try to run it while the X server is active, the DRM permissions prevent me from doing so. I assume this is because the X server already has "master" control over DRM. Is there any way to override this and have a DRM app take over the screen, then return control to the X server once it completes? This would be preferable to having to switch to a console using ctrl+alt+FX first. I am running Ubuntu 22.
The DRM userspace library (libdrm, mirrored in the mesa-drm repository) includes a "drmSetMaster" function in xf86drm.h:
https://github.com/freedesktop/mesa-drm/blob/master/xf86drm.h
However, I just get the same "permission denied" error. When I run as root, I get a "Device or resource busy" error.
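For reference, here is a minimal ctypes sketch of that call. The library name libdrm.so.2 and device node /dev/dri/card0 are assumptions that may differ on your system, and you can expect it to fail exactly as described while X holds master.

import ctypes
import os

# Try to become DRM master on the first card. This fails with
# "permission denied" or "device busy" while another master
# (e.g. the X server) holds the device.
libdrm = ctypes.CDLL("libdrm.so.2")
fd = os.open("/dev/dri/card0", os.O_RDWR)
if libdrm.drmSetMaster(fd) != 0:
    print("drmSetMaster failed; another DRM master is probably active")
os.close(fd)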
What you want to do is use the KMS API to create a new “DRM plane” (a.k.a. “overlay”) and display your content there. The DRM API does not have a mechanism to “take over the screen”; what it gives you is a way to create new planes and display content on those planes. The display controller then takes care of compositing those planes together and displaying the result.
Each display controller has its own way of handling planes, but typically the most complex display controllers give you the greatest control over how the planes are composed.
The best way to get started on this is to look at the source code for the DRM drivers in the kernel. For example, the Intel DRM driver contains a full-featured KMS API implementation. It is a large and complex driver, but it should give you a good idea of what’s possible with the KMS API.
I'm trying to move a Logitech steering wheel to a certain angle and keep it there using the force feedback API. I'm fine with absolutely any programming language, on any platform (Windows, Linux, macOS). Could you please give me a few hints on how to go about implementing this?
Force feedback support is actually in the Linux kernel in most distributions. I think it is best to look into the joystick package to control it. It features joystick control tools with force feedback support. You can install it on a Debian-based distribution like Ubuntu using:
sudo apt-get install joystick
Besides the applications to read controller positions (jstest and jstest-gtk), it features a couple of commands to control force feedback, like:
fftest
ffcfstress
ffmvforce
You will need to find out which event device your racing wheel is controlled on. You can do this by entering:
cat /proc/bus/input/devices | less
If it is connected, you should find your racing wheel by its name, followed by something like:
N: Name="Logitech Inc. WingMan Formula Force GP"
...
H: Handlers=js0 event9
Then you can use ffcfstress to let your race wheel oscillate. Fix it securely and enter:
sudo ffcfstress -d /dev/input/event9
You might need to specify the axis that has force feedback, which is usually only the wheel (not the pedals):
sudo ffcfstress -d /dev/input/event9 -x 6
If you are interested in the source code, you can get it with:
apt-get source joystick
You will find the source code here:
./utils/ffcfstress.c
You might also find the kernel's force feedback documentation (Documentation/input/ff.rst in the kernel source tree) useful.
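If you would rather do this programmatically than with the test tools, here is a rough proportional-control sketch using the python-evdev bindings (my own suggestion, not part of the joystick package). It reads the wheel position and uploads a constant-force effect that pulls the wheel toward a target angle. The device path, axis, gain, and FF_CONSTANT support are all assumptions you need to verify for your hardware (fftest lists the supported effect types).

import time
from evdev import InputDevice, ecodes, ff

# Assumptions: wheel at /dev/input/event9, steering axis is ABS_X,
# and the device supports FF_CONSTANT.
dev = InputDevice("/dev/input/event9")

TARGET = 0   # desired wheel position, in raw axis units
GAIN = 30    # proportional gain; start low and tune carefully

def constant_effect(level):
    # A constant-force effect with no attack/fade envelope; the sign
    # of level flips the force direction along the 0x4000 axis.
    envelope = ff.Envelope(0, 0, 0, 0)
    etype = ff.EffectType(ff_constant_effect=ff.Constant(level, envelope))
    return ff.Effect(ecodes.FF_CONSTANT, -1, 0x4000,
                     ff.Trigger(0, 0), ff.Replay(100, 0), etype)

effect_id = None
while True:
    pos = dev.absinfo(ecodes.ABS_X).value
    level = max(-32767, min(32767, GAIN * (TARGET - pos)))
    if effect_id is not None:
        dev.erase_effect(effect_id)
    effect_id = dev.upload_effect(constant_effect(int(level)))
    dev.write(ecodes.EV_FF, effect_id, 1)   # play the effect once
    time.sleep(0.02)

Be careful: as with ffcfstress, fix the wheel securely before running anything that drives the motor.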
Have fun, and please give your result back to the community!
We're building off of the Tower app, which was built with dronekit-android, and flying the 3DR Solo with it. We're thinking about adding some sort of collision detection to it.
Is it feasible to run some Python script on the drone, basically reading some IR or ultrasound sensor via the accessory bay, and yelling at the Android tablet when it detects something? That way, the tablet will tell the drone to fly backwards or something.
Otherwise, would we use the dronekit-python libs to do that? How would we use a tablet / computer to have Tower-like functionality with that?
Thanks a bunch.
I was trying to use the DroneKit-Python API to control the movement of a drone. I've been reading what's in that link, but I can't find what I need. I want to be able to run the code with the drone indoors (and of course outdoors), so I can't rely on GPS. I've tried to eliminate that part and use only the send_ned_velocity() method (without the propellers), but I couldn't hear a significant change in the movement of the motors.
The only way I can think of is using channel_override, but it doesn't seem to be the best choice. Can anyone help me?
Thank you in advance.
send_ned_velocity() will only work if you are in GUIDED mode. With ArduCopter 3.3, you can only be in GUIDED mode if you have a GPS lock, so you aren't going to be able to use this command indoors.
You'll have to wait for 3.4 to be released; then GUIDED mode will be supported without GPS, but instead of GPS you will need an optical flow module and a rangefinder installed and configured.
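As a quick sanity check before sending velocity commands, you can verify the fix and the mode from DroneKit. A small sketch, assuming a connected vehicle object:

# On ArduCopter 3.3, GUIDED needs a 3D GPS fix, so velocity commands
# are silently useless without one.
if vehicle.gps_0.fix_type < 3:          # 3 means a 3D fix
    print("No 3D GPS fix; GUIDED mode won't engage indoors")
elif vehicle.mode.name != "GUIDED":
    print("Vehicle is in %s, not GUIDED; velocity commands are ignored"
          % vehicle.mode.name)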
I am very new at this and trying to get an understanding of it. I have read a lot on the DroneKit-Python site trying to figure out how exactly I am able to communicate with it.
The drone I am currently using is an Iris+.
I have looked around and there is software that already provides this, but I want to be able to control it and more.
I want to set waypoints, tell it to fly the given waypoints, and keep going to them. I also want it to be able to arm itself, which is in the example, and override the safety mechanism.
Here is the basic idea of what I am trying to use it for: have it fly up at a certain time, go to waypoints 1, 2, 3, 1, etc., and then after X amount of time or on low battery go back to the launch point and land.
I have found plenty of code that provides what I need to do, though I don't know if it will work and, more importantly, I don't even know how to start programming for this. Maybe I have the wrong approach in doing this?
I kind of want this to be a light API, so that in the future I can make a simple UI on my phone and insert some coordinates to give it waypoints, and that is it. I know there is software out there already that does this, but I want to remove the need for touching the drone. I want it to start and end autonomously.
If anyone could help provide some info, that would be greatly appreciated.
Assuming you have no companion computer (the Iris+ does not by default), that you are OK with running a ground station app (i.e. you won't be out of range to send the "end mission on time expiry" command), and that driving the behaviour from your phone is important, I would be looking at DroneKit Android.
Some notes:
You're going to have to touch the drone at some point to attach the batteries.
You can arm the device from DroneKit.
You can override the safety mechanism from a script. I hope you have a lot of money to pay for the new drones you're going to have to buy when they crash, and for all the litigation from damaged people and property (in other words, "don't do it").
The default behaviour is to return the device to launch (RTL) on low battery. This is configurable.
Setting a time is more "problematic". You can have a timer in a script that then sends return-to-launch (see the sketch after these notes), but the script needs to be connected to the UAV. This means that either you have to be running in a connected ground station (which might potentially be out of range) or on a companion computer.
Iris+ does not have a companion computer. You have to install one or connect from a Ground Control Station.
DroneKit-Python runs on Linux, Mac OS X or Windows. You can't just run it on an ordinary phone, though you could find some other mechanism to send messages/scripts to it running on a companion computer.
DroneKit Android runs on Android. We do have a planned iOS version too. In theory these could run on a companion computer, but in practice currently these are only used as ground stations.
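For the timer note above, here is a minimal DroneKit-Python sketch of the ground-station-side approach. The connection string, time budget, and battery threshold are all assumptions you would replace with your own values.

import time
from dronekit import connect, VehicleMode

# Hypothetical connection string; use whatever telemetry link you have.
vehicle = connect("udp:127.0.0.1:14550", wait_ready=True)

MISSION_SECONDS = 10 * 60   # the "X amount of time" budget
start = time.time()
while time.time() - start < MISSION_SECONDS:
    # battery.level is a percentage and may be None on some vehicles
    if vehicle.battery.level is not None and vehicle.battery.level < 20:
        break
    time.sleep(1)

# Time expired or battery low: send the vehicle home to land.
vehicle.mode = VehicleMode("RTL")

Note that this only works while the ground station stays connected, which is exactly the range limitation described above.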