Verifone VX 675 GUI development

I have a Verifone VX 675 device. I am supposed to develop a GUI on it.
Do I need the Verifone ADK?
Or can I use the archive I was given, Mx800-vfi-toolchain-011a.tgz? (The Mx800 is another device from Verifone.)
Can someone please clarify this for me?

It's pretty hard to say from your description what kind of GUI you are building. The VX 675 may come with either the eVo or the V/OS operating system, and the ADK may or may not be in use there, as both options are available. You should probably address those questions to the estate owner and the device vendor.

Related

Searching for a method to get Wireless Access Point information programmatically

I've been investigating a way of tracking a device's location in a building, at first I was intending to use iBeacons. However I have been since told that it must be done by monitoring access points and looking for the MAC address of that device.
I can't seem to find any generic sort of library or API that can hook into an access point and give me details. In fact, I don't even know where to start looking, which is making it even harder.
Has anybody had dealings with this and could point me in the right direction? Any programming language acceptable.
I have written software from scratch to do exactly this, but to my knowledge, no frameworks are available.
The basic steps are:
Get a number of small, low-cost computers (Raspberry Pis work nicely) to act as sensors that do WiFi scanning in promiscuous mode, collecting unique MACs and detection times. On Linux, you can use C or Java software to collect these records; a rough sketch of this piece follows the list.
Write the info from the sensors to a server, including the sensor identifier so you know where each device was detected.
Write lots of code to crunch the numbers.
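To make that first step concrete, here is a minimal, hypothetical sketch of the sensor piece in C using libpcap. The interface name wlan0mon is a placeholder for a monitor-mode interface, the radiotap/802.11 handling is simplified, and a real deployment would batch these records and ship them to the server in step two rather than just print them.

/*
 * Hypothetical sensor sketch: capture 802.11 probe requests on a
 * monitor-mode interface and print a timestamp plus the sender MAC.
 * Build: gcc -o sensor sensor.c -lpcap
 */
#include <pcap.h>
#include <stdio.h>
#include <stdint.h>

static void handler(u_char *user, const struct pcap_pkthdr *h,
                    const u_char *bytes)
{
    (void)user;
    if (h->caplen < 4)
        return;

    /* Radiotap header: bytes 2-3 hold its total length (little endian). */
    uint16_t rt_len = bytes[2] | (bytes[3] << 8);
    if (h->caplen < (bpf_u_int32)rt_len + 24)
        return;

    const u_char *dot11 = bytes + rt_len;
    uint8_t type    = (dot11[0] >> 2) & 0x3;
    uint8_t subtype = (dot11[0] >> 4) & 0xf;

    /* Management frame (type 0), probe request (subtype 4): addr2 is the sender. */
    if (type == 0 && subtype == 4) {
        const u_char *mac = dot11 + 10;
        printf("%ld %02x:%02x:%02x:%02x:%02x:%02x\n",
               (long)h->ts.tv_sec,
               mac[0], mac[1], mac[2], mac[3], mac[4], mac[5]);
    }
}

int main(void)
{
    char errbuf[PCAP_ERRBUF_SIZE];
    /* "wlan0mon" is a placeholder for your monitor-mode interface. */
    pcap_t *p = pcap_open_live("wlan0mon", 256, 1, 1000, errbuf);
    if (!p) {
        fprintf(stderr, "pcap_open_live: %s\n", errbuf);
        return 1;
    }
    if (pcap_datalink(p) != DLT_IEEE802_11_RADIO)
        fprintf(stderr, "warning: not a radiotap link, parsing may be off\n");
    pcap_loop(p, -1, handler, NULL);
    pcap_close(p);
    return 0;
}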
You should be aware of three big issues:
Mobile devices aren't always detectable on WiFi. If they are asleep, or simply not communicating, you will not detect them. On iOS the best you can hope for is detections every minute or so if the device is not locked and not actively using WiFi.
On iOS 8+, MAC addresses are scrambled under certain conditions, often making it impossible to track unique devices.
Building the above from scratch is a lot of work. Think several man months for even a basic system.
I know you were asked to build it this way and not with beacons, but beacons do provide a much simpler path forward if you can ensure an app on each device and can revisit this design constraint.
I suggest you do this with iBeacons.
But:
On iOS 8, any attempt to get the MAC address from a device returns this value: 02:00:00:00:00:00.
The best way to get a unique identifier for a device is to use the identifierForVendor property of UIDevice.
Like this:
// identifierForVendor is an NSUUID; its UUIDString is stable per device per vendor.
UIDevice *device = [UIDevice currentDevice];
NSString *uniqueIdForDevice = [[device identifierForVendor] UUIDString];
NSLog(@"%@", uniqueIdForDevice);
That gives you an ID that is unique to that device for apps from your vendor.
I hope this helps.
I'm not sure what the constraints are on your particular design, but there are existing systems that provide the kind of monitoring I think you're looking for. As davidgyoung writes, there are a few technical challenges at the OS level that will be present in whatever wi-fi solution you choose (e.g. iOS MAC address rotation). That aside, you might get some value from looking at solutions from wi-fi hardware manufacturers like Cisco: https://meraki.cisco.com/solutions/cmx . They have a pre-built platform for visitor data (i.e. showing you where phones are in buildings/spaces). I think all the major wi-fi hardware manufacturers have something similar now, and Cisco are likely to be top-end. From memory, Aruba were much cheaper (I'm going back ~10 months).
There are also software providers like Euclid Analytics ( http://euclidanalytics.com ) who build on top of the hardware and APIs of providers like Cisco to provide visitor info like I think you want.
This isn't an exhaustive list as I'm writing this from memory but hopefully a bit of Googling based on the above should give you a better chance of success than rolling your own.
Good luck,
James
If you want to do indoor location services, then I would recommend checking out Cisco Connected Mobile Experience software.
You can try it free.
Based on my analysis, it is the best solution out there. I am biased because I work there. But, I do competitive analysis and have yet to find anything I think works better.

Creating a touch screen driver for OS X: where to start?

OK, so I recently purchased an Acer T232HL touch screen display to hook up to my Macbook Pro as a secondary monitor. To give you an idea, here's my setup.
OS X doesn't support this monitor in any way, so as you can see in the screenshot I'm actually running Windows 8 through VMware, which proxies the USB connection to Windows perfectly where the touch events are supported. But obviously this isn't ideal.
There's at least one 3rd party driver for OS X that looked sort of promising, but it doesn't seem to support multitouch from this device, it's expensive, and generally was a pain to get working to the small degree it was. There's also mt4j but best I could tell after running their examples, it doesn't support this device at all.
So here's my question: what exactly am I in for if I want to write a driver for this thing? I'm mostly a web developer with years of experience in Ruby, Objective-C (and a little C), Javascript, etc. I have never ventured into any kind of hardware programming, so from the surface this feels like an interesting, if intimidating, challenge.
I know at some level I need to read data from USB. I know this will probably mean trying to reverse engineer whatever protocol they're using for the touch events (is it possible this will be entirely custom?). However I haven't got a clue where to start - would this be a kernel extension? In C, I presume? Would love a high level overview of the moving parts involved here.
Ultimately I want to use the touch screen to drive a specialized web interface (running in Chrome), so ideally I could proxy the touch events straight to Chrome without the OS actually moving the mouse cursor to the touch location (so have the UI behave just as it would on an iPad), but regardless of whether this is technically possible, I'd love to start with just getting something working.
You're going to want to start with Apple's I/O Kit documentation. You can hope that the touchscreen isn't completely custom, though there must be some part that's not standard USB HID, or it would work already. If there are any linux (or other open source) drivers available, you'll have the advantage that somebody already did some of the reverse engineering for you. As an alternative to the I/O Kit, you might also want to look into libusb, which might make your brain hurt less when getting started. If you end up needing to write a kext, that might not help you anymore, though.
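If you try the libusb route first, a small user-space probe is a cheap way to get started before committing to a kext. This is only a hedged sketch: the header path and link flag assume libusb-1.0 as commonly packaged, and you would match your display's actual vendor/product IDs, which are not shown here.

/*
 * Minimal libusb-1.0 sketch (not a driver): enumerate USB devices so you
 * can find the touchscreen's vendor/product IDs and interfaces.
 * Build: clang -o probe probe.c -lusb-1.0
 */
#include <libusb-1.0/libusb.h>
#include <stdio.h>

int main(void)
{
    libusb_context *ctx = NULL;
    if (libusb_init(&ctx) != 0) {
        fprintf(stderr, "libusb_init failed\n");
        return 1;
    }

    libusb_device **list = NULL;
    ssize_t n = libusb_get_device_list(ctx, &list);
    for (ssize_t i = 0; i < n; i++) {
        struct libusb_device_descriptor desc;
        if (libusb_get_device_descriptor(list[i], &desc) == 0) {
            printf("bus %d dev %d: %04x:%04x (class %d)\n",
                   libusb_get_bus_number(list[i]),
                   libusb_get_device_address(list[i]),
                   desc.idVendor, desc.idProduct, desc.bDeviceClass);
            /* If this matches the touchscreen, libusb_open() it and start
             * reading its interrupt endpoints to reverse the protocol. */
        }
    }
    libusb_free_device_list(list, 1);
    libusb_exit(ctx);
    return 0;
}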
As to some of your specific questions:
would this be a kernel extension?
Maybe, maybe not. I'm not really up on the Mac OS X driver situation, but I did write some totally user-space USB code for OS X many years ago. Maybe you'll be as lucky.
In C, I presume?
Probably. I/O Kit itself is written in a subset of C++, so you can probably use that too, if you prefer.
is it possible this will be entirely custom?
Unfortunately, yes, it's possible.
Good luck!

Writing a driver to fool Linux systems about having a GPU

I'm looking into writing a "mock GPU driver" for Linux-based systems. What I mean is that I simply want to write a driver (behind the X server, obviously) that answers X's API calls with some debugging messages.
In other words, I want to fool Linux into thinking it has an actual GPU, so I can make a test bed for GUI-accelerated packages on console-based systems.
Right now, if I execute a GUI-accelerated package on a console-based Linux system, it simply dies due to the lack of a real GPU (or rather, of a GPU driver).
So I want to know:
Is it even possible? (Writing a GPU driver to fool Linux about having an actual GPU)
What resources do you recommend before getting my hands dirty in code?
Is there any similar projects around the net?
PS: I'm an experienced ANSI C programmer, but I don't have any clue about real kernel/driver development under *nix (I have read some tutorials about USB driver development, though), so any resources about these areas will be really appreciated as well. Thanks in advance.
What you are looking for is actually part of Xorg server suite, and it is called Xvfb (virtual framebuffer).
If you're not afraid of somewhat complex bash, you can take a look at Gentoo's virtualx.eclass for a usage example (we use it to run tests which require X11).
A good place to start is the Mesa project - it implements OpenGL in software. It has a way to trick the OS into thinking that it is the OpenGL driver.
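To illustrate the test-bed idea, here is a hedged sketch: start Xvfb on a spare display number (the :99 below is just an example) and run a trivial Xlib client against it. If these calls succeed on a machine with no GPU, the same setup can exercise your GUI packages, with Mesa's software path covering OpenGL.

/*
 * Tiny Xlib client to confirm X11 calls work against Xvfb on a GPU-less box.
 * Run Xvfb on some display (e.g. :99), set DISPLAY accordingly, then run this.
 * Build: gcc -o xcheck xcheck.c -lX11
 */
#include <X11/Xlib.h>
#include <stdio.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);   /* honours $DISPLAY, e.g. ":99" */
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }

    int screen = DefaultScreen(dpy);
    Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, screen),
                                     0, 0, 200, 100, 1,
                                     BlackPixel(dpy, screen),
                                     WhitePixel(dpy, screen));
    XMapWindow(dpy, win);
    XFlush(dpy);

    printf("connected to %s, screen %dx%d\n",
           DisplayString(dpy),
           DisplayWidth(dpy, screen), DisplayHeight(dpy, screen));

    XDestroyWindow(dpy, win);
    XCloseDisplay(dpy);
    return 0;
}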

Resources on how to build your own Mobile Phone?

Now that webOS is open source, I am trying to find any resources on building your own mobile phone in the US. That is, put webOS on some custom hardware that has 3G voice access.
I realize this question is not a programming question but I could not find another StackExchange that was applicable.
I would suggest looking at OpenMoko and their history of attempting to release phones based on open hardware specifications for open source mobile operating systems.
Per the wiki article, Openmoko phones now support Android, Debian, Gentoo, Qt Extended Improved, QtMoko, and SHR. The announcement of the webOS opensourcing means that it's possible that webOS could be ported, as well.
You're likely going to have to wait a bit before using just any hardware. While webOS is going open source, right now the only thing that's released is Enyo (the application framework, minus the UI elements). There are other components to the OS that are still unreleased, and the OS runs on a modified linux kernel. They do plan, however, to release a version by the end of this year, called open webOS 1.0, which will run on a standard linux kernel.

Are there any ARM based systems/emulators with a graphical frame buffer that allow for (relatively) legacy-free Assembly programming?

I am looking for a modern system to do some bare bones Assembly programming (for fun/learning) that does not have the legacy burden of x86 platforms (where you still have to deal with BIOS, switching to protected mode, VESA horrors to be able to output pixels to the screen in modern resolutions/colordepths etc.). Do such systems even exist? I suspect it is not even possible today to do low-level graphics programming without dealing with proprietary hardware.
QEMU is likely what you want if you don't want to have to deal with that stuff. You won't get as much visibility into what is going on in the guts of it, though.
For hardware, the BeagleBoard (don't get the old one; get the new one with reasonable connectors, etc.), or the OpenRD board. I was disappointed with the plug computer thing. I like the Hawkboard better than the BeagleBoard, but am concerned about the big banner about a PCB design problem. The Raspberry Pi will be out at some point and will also provide what you are looking for. Note that for the BeagleBoard, etc., you don't have to run Linux or anything like that; you can write your own binary and xmodem it over, or use the network, and then just run it, not a problem at all.
The Stellaris eval boards all or mostly have OLED displays, monochrome and small, but graphics; not sure how much you were after.
Earth-LCD used to have an ARM-based board with a decent-sized panel on it.
There is of course the Game Boy Advance and the Nintendo DS. Flash/developer cartridges are under $20. The GBA is better to start with IMO, as the NDS is like two GBAs competing for shared resources and a little confusing. With an EZ Flash cartridge (open source software to program it), it was easy to put a bootloader on the GBA and, for like another $20, create a serial cable; I have a serial-based bootloader for loading the programs. If you have an interest in this path, start with the VisualBoyAdvance emulator to get your feet wet and see how you feel about the platform.
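For a flavor of how little stands between you and pixels on the GBA, here is a rough sketch of mode 3 drawing in C. The register and VRAM addresses are the well-known documented ones; the usual arm-none-eabi toolchain, crt0 and linker script are assumed and not shown.

/*
 * GBA mode 3: a 240x160, 15-bit linear framebuffer at a fixed address.
 * Fill the screen with a simple gradient, one pixel at a time.
 */
#include <stdint.h>

#define REG_DISPCNT (*(volatile uint32_t *)0x04000000)
#define MODE3       0x0003
#define BG2_ENABLE  0x0400
#define VRAM        ((volatile uint16_t *)0x06000000)

#define RGB15(r, g, b) ((r) | ((g) << 5) | ((b) << 10))

int main(void)
{
    REG_DISPCNT = MODE3 | BG2_ENABLE;

    for (int y = 0; y < 160; y++)
        for (int x = 0; x < 240; x++)
            VRAM[y * 240 + x] = RGB15(x & 31, y & 31, 15);

    for (;;)
        ;   /* nothing to return to on bare metal */
}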
If you go to sparkfun.com there are likely a number of boards that either already have LCD connectors that you would mate up with a display, or definitely displays and breakout boards that you could connect to a number of microcontroller development boards. Other than the insanely painful blue LEDs, and the implication that there is 64KB (there is, but non-linear: 32KB+16+16), the mbed board is nice, up to 100 MHz, Cortex-M3. I have some mbed samples at github as well that walk you through building an ARM binary to boot an ARM from flash, for those that have not done it (and want to learn that rather than call some APIs in a sandbox).
The ARMmite Pro and the Maple (SparkFun) are ARM-based Arduino-footprint platforms, so for example you can get the color LCD shield or the Gameduino.
There is the Open Pandora project. I was quite disappointed with the experience: after over a year I paid another fee to get the unit, and it failed within a few minutes. I sent it back, and I need to check my credit card statement; maybe we took the return-it-and-give-it-to-someone-who-wants-it path. I have used the GamePark GP32 and the GP2X, but not the Wiz. The GP2X was fine other than some memory I/O problem in the chip that caused chaotic timing; the thing would run just fine, but memory performance was all over the map and non-deterministic. The GP32 is not what you are looking for, but the GP2X might be; finding connectors for a serial cable might be more difficult now that the cell phone cable that folks used to cut up is not as readily available.
Gen 1 iPod Nanos can still be had easily, as well as the older-gen iPod Classics. They're easy to homebrew, and the LCD panels are easy to get at; grayscale only, maybe only black and white, I don't remember. All the programming info comes from the iPodLinux folks.
I have not tried it yet, but the Barnes & Noble folks are homebrew friendly, or as friendly as anyone on that scale has been so far. The Nook Color can easily be turned into a generic Android device, so I assume that also means you could develop homebrew on the metal; not sure though, I have not studied it.
You might look at Always Innovating; my experience with them was similar to the Open Pandora folks. They started with a modified BeagleBoard in a box with a display and batteries, then added a couple more products. Any one of them should be very open and homebrew friendly, so you can write at whatever level you want and boot and run on the metal, no problem. For the original product it was one of those wait-for-several-months things.
I am hoping the Raspberry Pi becomes the next BeagleBoard, but better.
BTW, all hardware is proprietary; it is just a matter of whether they choose to provide programming information or not. VESA came about because no two vendors did it the same way, and that has not changed: you still have to read the datasheets and programmer's reference manuals. But as you can see above, I have only scratched the surface, and covered the sub- or close-to-$100 items. If you are willing to pay in the thousands of dollars, that greatly opens the door to graphics-based development platforms that are well documented and relatively sandbox free. Many are ARM based, since ARM is the choice for phones, etc., and these are phone-like, tablet-like eval platforms.
The Android emulator is such a beast; it runs a Linux kernel and driver stack (including /dev/fb) that one can log into via the Android Debug Bridge, and it runs (statically linked) arm-linux-eabi applications. Framebuffer access is possible; a sketch follows below.
The meta-question rather is, what do you mean by "low-level" graphics programming; no emulator is going to expose all the register and chip state complexity that's behind a modern graphics chip pipeline. But simple framebuffer contents manipulation (pixel buffer access) is surely simple enough, as is experimenting with software rendering in ARM assembly.
Of course, things that you can do with the Android emulator you can also do with cheap physical ARM hardware, like the BeagleBoard and similar. Real complexity only begins when you want to access "advanced" things - that is, any accelerated functionality beyond just reading/writing framebuffer contents.
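As a concrete, hedged example of the plain framebuffer manipulation described above, the sketch below opens /dev/fb0, mmaps it, and paints a gradient. It assumes a 32-bits-per-pixel layout, which is common in the Android emulator and on BeagleBoard-class Linux images, but check the reported screen info before trusting the pixel format.

/*
 * Linux framebuffer sketch: query /dev/fb0, mmap it, draw a gradient.
 * Build (cross): arm-linux-gnueabi-gcc -static -o fbtest fbtest.c
 */
#include <fcntl.h>
#include <linux/fb.h>
#include <stdint.h>
#include <stdio.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/fb0", O_RDWR);
    if (fd < 0) {
        perror("open /dev/fb0");
        return 1;
    }

    struct fb_var_screeninfo vinfo;
    struct fb_fix_screeninfo finfo;
    if (ioctl(fd, FBIOGET_VSCREENINFO, &vinfo) < 0 ||
        ioctl(fd, FBIOGET_FSCREENINFO, &finfo) < 0) {
        perror("ioctl");
        return 1;
    }

    size_t size = (size_t)finfo.line_length * vinfo.yres;
    uint8_t *fb = mmap(NULL, size, PROT_READ | PROT_WRITE,
                       MAP_SHARED, fd, 0);
    if (fb == MAP_FAILED) {
        perror("mmap");
        return 1;
    }

    /* Paint a simple gradient, assuming 32bpp XRGB. */
    for (uint32_t y = 0; y < vinfo.yres; y++) {
        uint32_t *row = (uint32_t *)(fb + y * finfo.line_length);
        for (uint32_t x = 0; x < vinfo.xres; x++)
            row[x] = ((x & 0xff) << 16) | ((y & 0xff) << 8);
    }

    munmap(fb, size);
    close(fd);
    return 0;
}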
New Answer
I recently came across this while looking for emulators to run NetBSD on, but there's a project called GXemul that provides a full-system computer architecture emulation with support for a variety of virtual devices and CPUs. The primary and most up-to-date core looks to be MIPS-based, but it also lists support for emulating the ARM architecture. It even includes an integrated debugger and it sounds like you can just assemble your code into a raw binary with some bootstrapping code and boot it as a kernel inside the emulator from the commandline.
Previous Answer
This isn't an emulator, but if you're interested in having a complete, ARM-based computer that you can develop whatever you want on that doesn't cost much, you should keep an eye on the Raspberry Pi project. They're very close to selling a complete, tiny, low-power ARM-based computer for $25 a piece. It has USB ports, ethernet, video out, and an SD card reader, and can boot Linux, although in your case you'd probably want to boot your own code and access the hardware directly.
EDIT: Looks like Erik already mentioned it.
