I just watched the Super Mario Bros. -1 World glitch on YouTube, and I really began wondering about the code behind those games. Which language was used? What about the OS for the video game consoles? Are there any websites with resources about this subject? (I am a 90s video gamer, so I am particularly interested in the programming behind those games, but feel free to make this a wiki and include links to resources about video game programming in general, if you want.)
Having somewhat worked on an emulator for the NES (I have it decoding some opcodes, but none of the other hardware is emulated), I can maybe share a few answers.
For most games, assembler was used. Optimizing compilers, if they were available for the CPU at all, were nowhere near as good 20-30 years ago as they are today. To get performance, you needed to write in assembler (this even held true on the PC; parts of Doom are in ASM). All the more so since the NES CPU ran at less than 2 MHz. Also, memory was more expensive then than it is today. The original Mario was stored in about 40k of memory; 16k of that was the actual code, and the remainder was the graphics and sound resources.
Until the 32-bit console era, any sort of operating system, or even built-in utilities, on a console was uncommon (the Sega CD was one of the few in the 16-bit era with an actual BIOS, and there was a small program burned into the Game Boy's processor that was responsible for the Nintendo logo scrolling down on power-on). See above about size constraints as a main reason. When you inserted the cartridge, the ROM chip in the cart was connected directly to the address bus of the CPU. On power-on, the CPU would read from a fixed address to get the actual address the program started at, then jump to that location and start execution.
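As a hedged illustration of that last step (names like read_mem() are made up for this sketch, not taken from any real emulator), this is roughly how a 6502-family CPU such as the NES's finds its entry point: it reads a 16-bit address from $FFFC/$FFFD, which the cartridge ROM maps into the address space, and starts executing there.

```c
#include <stdint.h>

/* Assumed helper: returns the byte the cartridge/board maps at this address. */
uint8_t read_mem(uint16_t addr);

/* On reset, the 6502 fetches its start address from the reset vector
   at $FFFC (low byte) and $FFFD (high byte). */
uint16_t reset_vector(void)
{
    uint8_t lo = read_mem(0xFFFC);
    uint8_t hi = read_mem(0xFFFD);
    return (uint16_t)((hi << 8) | lo);   /* the CPU begins executing here */
}
```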
As for resources, the NES Dev Wiki has resources concerning the NES hardware, along with programming references. Zophar's Domain also has technical documents and public domain ROMs for quite a few consoles (I don't know if I should link to ZD on this site; just google it).
Most of the older consoles had some kind of BIOS ROM.
Some of the source code for these is online:
You can read the mostly-commented disassembly to the 7800 BIOS: http://atarihq.com/danb/files/7800bios.asm
The Atari 5200's BIOS source is more interesting, since it does more than just initialize the system and display a splash screen: http://atarihq.com/danb/files/5200BIOS.txt
The Colecovision had an 8K (!) BIOS ROM as well; its source is here: http://xi6.com/code/coleco/coleco29.asm
The Odyssey II BIOS source is here: http://atarihq.com/danb/files/o2romsrc.txt
The Intellivision had an OS called "exec"; I can't find a disassembly online, though I did find a bunch of info about it: http://www.intellivisiongames.com/bluesky/hardware/intelli_tech.html#exec and http://www.beeslife.com/faq.htm#_Toc140592020 - it had routines to move sprites, read controllers, and calculate square roots!
Most of the glitches in that video are tile-based glitches, where there are bugs in the collision detection of the tile maps that make up the levels. All levels are made up of square-shaped tiles. If you look closely, Mario is always between tiles where he shouldn't be.
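As a hedged sketch (not Nintendo's actual code), tile-based collision usually boils down to dividing the player's pixel position by the tile size and looking up the result in the level map:

```c
/* Illustrative tile-collision lookup, assuming 16x16-pixel tiles as on the
   NES. The map layout and function names are made up for this example. */
#define TILE_SIZE 16

int tile_is_solid(const unsigned char *tilemap, int map_width, int px, int py)
{
    int tx = px / TILE_SIZE;                  /* pixel x -> tile column */
    int ty = py / TILE_SIZE;                  /* pixel y -> tile row    */
    return tilemap[ty * map_width + tx] != 0; /* nonzero means solid    */
}
```

If the engine only samples a few points on the sprite (feet, head, sides) or corrects an overlap by snapping the position a whole tile at a time, the right combination of speed and timing can push Mario between two tiles without any sampled point ever landing on a solid one.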
Back in the late 70s, 80s, and early 90s, most software (including games) was written in assembly (ASM). If you are unfamiliar with assembly, it is essentially a very low-level, hardware-specific programming language for programming the CPU. This meant that you had to control every pixel on screen and recreate libraries for things like physics, graphics, and even sound! You were very limited in memory, so recycling was a must. In the original Super Mario Bros. you will notice that the clouds are the same as the bushes; the only difference is the color. A lot of sprites were recycled, and the game physics were limited.
As games became more and more complicated, developers moved on to the C language, which allowed software to be written a lot more quickly because it required fewer lines of code. Nowadays a lot of console and computer games are written in C++, because it allows faster development while keeping the software close enough to the hardware for good performance.
I haven't done research about this, but Super Mario Bros. and related 90s games are available as .nes files instead of cartridges, and open-source emulators for them are also available.
AFAIK, these emulators are generally written in C++. I don't know about the legality of these .nes files and emulators, but they are available on the internet; you just have to search with the right string!
I'm about to start a new project on a classic STM32L4-based product.
I have good experience in ARM development, but not in STM32 specifically.
I am wondering about the quality and performance of the STM32 HAL and low-level drivers provided by STMicro (in the STM32Cube package).
I'd like to gather developers' experience and feedback on the topic.
Basically, I'd like to know if you are happy with this code or, on the contrary, if you have encountered issues, whether some of you developed your own drivers for some reason, etc.
Thank you!
I do not like HAL for many reasons:
It gives inexperienced developers the false feeling that they do not have to know how their hardware works.
Time spent learning the HAL may be longer (and usually is) than the time needed to understand how the hardware works.
Horrible overhead.
Many errors.
But on the other hand, I use the HAL (deeply modified by me) to control two peripherals, USB and Ethernet, because writing those drivers from scratch would take too much time. But as I wrote before, I know how they work at the hardware/low level and modified the HAL to my liking.
After the transition from smaller 8-bit microcontrollers to ARM, I started using the HAL library on STM32 right away and had a more or less satisfying experience. But it comes with an overhead, as already stated, and a quite large set of poorly documented functionality, which can lead to some confusion.
However, the big advantage of using the HAL over hand-written code from scratch was the level of abstraction it provides. That came in handy when I needed to switch from one type of STM32 to another, and also when I needed to get things up and running quickly. I've used quite similar code on a couple of very different types/families of STM32 micros (L0, L1, F1, F4, F7); it actually worked most of the time. Using the HAL library made the transition much less painful than it would be if you needed to know the exact memory map and register layout of each specific micro...
With that said, I need to admit that I'm still a newbie when it comes to modern embedded software and I'm still learning, after about 2 years of prototyping work on different STM32 based projects (hobby and professional). I still need to learn more about the provided LL code, for example.
Entering the embedded field with a different software background, using HAL-level code instead of twiddling single bits of different registers in the right sequence, and taking all the different restrictions into account to get, for example, basic UART / SPI / I2C communication working, eased things up quite a bit for me. IMHO, the STM32 HAL lies in a middle ground between pure hand-written code and what mbed does, for example (C++ / vendor-agnostic abstraction, as far as I know). It tames the complex beast to an acceptable level, so that an average software developer like me can handle it. That comes with some trade-offs, as already mentioned by others...
After all, the STM32 HAL also kind of serves as a boilerplate code repository that can sometimes be easier to read and understand than the cryptic reference manual. Using HAL code generated by STM32CubeMX always gave me a much smoother start at bring-up time, when I needed to quickly test a new board. It can also help to experiment and test things out. And when a performance-critical part needs to be hand-optimized later on, that will still be possible after setting up a project, or even while incrementally adjusting it with STM32CubeMX. You can mix hand-written code with HAL code.
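To make that last point concrete, here is a hedged sketch (assuming an STM32L4 project with the CubeMX-generated headers and an LED on PA5, as on many Nucleo boards) of HAL calls and a direct register write living side by side; the HAL call and the BSRR write drive the same pin:

```c
#include "stm32l4xx_hal.h"   /* assumption: an L4 project generated by CubeMX */

void led_demo(void)
{
    GPIO_InitTypeDef cfg = {0};

    __HAL_RCC_GPIOA_CLK_ENABLE();            /* HAL macro: enable the GPIOA clock */
    cfg.Pin   = GPIO_PIN_5;
    cfg.Mode  = GPIO_MODE_OUTPUT_PP;
    cfg.Pull  = GPIO_NOPULL;
    cfg.Speed = GPIO_SPEED_FREQ_LOW;
    HAL_GPIO_Init(GPIOA, &cfg);              /* HAL-level pin configuration */

    HAL_GPIO_WritePin(GPIOA, GPIO_PIN_5, GPIO_PIN_SET);   /* HAL: set PA5   */
    GPIOA->BSRR = (uint32_t)GPIO_PIN_5 << 16U;            /* register-level: reset PA5 */
}
```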
Some problems recognized since 2016:
Some constants, structs and function signatures changed when new code updates were released by ST. Things seem to be in constant development.
Lack of good documentation (comments in code files) and clean example code (too specific, also not well documented).
Convoluted, sometimes inefficient code.
Spelling errors.
I personally do not like the HAL library, for the following reasons:
It takes up a lot of memory in my controller. I really do not have space to fit a bootloader and an application, and here I need to add the HAL overhead twice as well (once in the bootloader and again in the application).
It uses interrupts internally (I am pretty sure it does).
It is not bug-free; I once tried version 1.0 and failed horribly.
Debugging is a pain; you never know whether the bug is in your application or in the HAL.
What I did like from ST was the Standard Peripheral Library: it was essentially register-level (assembly-style) access translated into C, and very easy to use.
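For contrast with the HAL snippet above, here is a hedged sketch in the old Standard Peripheral Library style (assuming an STM32F4, since the SPL was never provided for newer families such as the L4); each call maps almost one-to-one onto a register write:

```c
#include "stm32f4xx_gpio.h"   /* StdPeriph headers, F4 family assumed */
#include "stm32f4xx_rcc.h"

void spl_led_on(void)
{
    GPIO_InitTypeDef cfg;

    RCC_AHB1PeriphClockCmd(RCC_AHB1Periph_GPIOA, ENABLE);  /* enable GPIOA clock */

    GPIO_StructInit(&cfg);
    cfg.GPIO_Pin   = GPIO_Pin_5;
    cfg.GPIO_Mode  = GPIO_Mode_OUT;
    cfg.GPIO_OType = GPIO_OType_PP;
    GPIO_Init(GPIOA, &cfg);            /* configure PA5 as a push-pull output */

    GPIO_SetBits(GPIOA, GPIO_Pin_5);   /* effectively a single BSRR write */
}
```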
I love the HAL and Cube, but you have to read the drivers and be prepared to suffer.
I used to wiggle bits like the naysayers; you can pick your poison.
In my situation, if I use the HAL, I can sucker a real programmer into maintaining my code. Say no more, I'm in on the HAL. Be forewarned, the Cube simply creates an approximation of something that works.
I followed these instructions and was successfully transmitting IEEE 802.15.4 frames on a GINA Mote. I know it was working because I have a packet sniffer that captured transmitted packets.
Here is the source code: https://github.com/openwsn-berkeley/openwsn-fw/tree/d1ec9982fbc101061b4bc70bde239e54cd1367c4/firmware/openos/bsp/boards/gina
I'm a little confused about how and why it's working, though. Is this code loading an operating system (like an RTOS) on the GINA mote, or is this project OS-less?
I'm looking for a solution that does not require an OS / bootloader.
I would appreciate if one of the experts in the community could weigh in on this.
The JTAG adapter rams the executable image up the MSP430 processor's butt, sets up the MSP430 to start executing at the image's start address, and lets 'er rip. That's it. There ain't no OS, and there's no code onboard the little processor board required for loading the executable image. Your program is the only code it ever knows. (And the JTAG adapter probably burns the code into the processor's flash, so it stays resident even when the JTAG adapter is removed.... and starts executing again any time the processor is reset.)
Now, you may wonder... There may be C runtime facilities available that you might think are associated with an operating system... perhaps printf(), malloc(), new, etc. Those are part of the C runtime & I/O subsystem, and can of course be implemented for a custom platform with no OS.
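As a hedged illustration of that point (the register names assume an MSP430 with a USCI_A0 UART, e.g. the msp430g2553, and whether printf() routes through putchar() depends on your C library, so treat this as a sketch rather than a recipe), retargeting character output to a UART is all it takes to get printf-style output with no OS at all:

```c
#include <msp430.h>

/* Send one character out of USCI_A0; tiny C libraries that build printf()
   on top of putchar() will then print over the UART with no OS involved. */
int putchar(int c)
{
    while (!(IFG2 & UCA0TXIFG))   /* wait until the TX buffer is free */
        ;
    UCA0TXBUF = (char)c;          /* hand the byte to the UART */
    return c;
}
```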
UPDATE: Hmm. What I mentioned above was true when I played around with small MSP430s back in 2008. At that time I only recall IAR; I don't recall there being an mspgcc. I believe the IAR solution is as I described above. The mspgcc solution seems to involve a "BSL" (bootstrap loader), per this web page. Or perhaps the BSL is just pre-loaded on the MSP430, and even IAR uses it... I dunno. In any case, with either the IAR or mspgcc toolchain, you should ultimately be able to burn your program into the processor's built-in flash, and once it is burnt in, you can remove your JTAG programming/debugging adapter, and from then on the CPU will automatically run your program every time it boots.
I am working on a project and my team is responsible for the software stack of the particular hardware.
I only have the instruction set of the processor in my hand and I need to develop the complete software stack with it.
Do I require anything else other than the instruction set for the assembler?
Please note that I am not aware of the organisation of the hardware of that computer.
The very short answer is "probably not possible without further information".
At the very least, you will need to know where the different types of memory are located and what you need to initialize within the processor itself [this is typically not in the instruction set manual]. Typical examples: the interrupt vectors, timers, memory controllers, etc., which are often part of the processor itself but not really part of the instruction set.
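As a hedged illustration of the kind of information that lives outside the instruction set manual, here is roughly what the very first piece of that picture looks like on a Cortex-M style part (the section name and the _estack symbol are assumptions that a matching linker script would have to provide):

```c
/* The CPU fetches its initial stack pointer and reset handler from a fixed
   address; nothing in the instruction set tells you that address or layout.
   It comes from the chip's reference manual and your linker script. */
extern unsigned long _estack;        /* top of RAM, defined in the linker script */
void Reset_Handler(void);            /* your startup/entry code */

__attribute__((section(".isr_vector")))
void (* const vector_table[])(void) = {
    (void (*)(void))&_estack,        /* word 0: initial stack pointer */
    Reset_Handler,                   /* word 1: address executed after reset */
};
```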
Obviously, the software stack for a digital wrist-watch is pretty basic. The software stack for a complete home entertainment system with the ability to stream encrypted video and also browse the web would be quite large, and the software stack for a mobile phone even larger. Meeting the software requirements for a wrist-watch [not saying you can get the hardware small enough easily, but ignoring that since this is a software question, not a hardware one] would probably take a few days at most. A smart-phone able to compete with at least some success against the top-of-the-range products on the market today would take a large team of very skilled software developers a couple of years to complete. Obviously, there are a lot of other software-based systems somewhere between those relatively extreme examples.
My friends and I have been trying to custom-build a mid-range mobile phone. We are thinking of porting Linux to it and modifying it as per our requirements.
Now, the problem is that we are unable to decide which processor to use: whether to use ARM or something else, and if ARM, which ARM architecture. It would be great if someone could also suggest which Linux variant to port.
OUR AIM: We want to build a device in this category; please follow the link:
Please Click here
Please give your valuable input; it will help us youngsters a great deal.
Thanks and Regards,
Avi and Co
Are you seriously getting into the phone business or is there some other real goal you are after (learning how something works, wanting to create a phone user interface, etc)? Getting into the phone business means tens to hundreds of millions of dollars, per product, of development money. Armies of lawyers to figure out the dozens of companies you are going to have to pay patent royalties to (do you follow slashdot.org?), etc. With that budget you already have the money to buy each of the popular eval boards and try them out. Likewise the software staff to try each of the linux porting methods.
Not to completely discourage you though.
ARM makes processor cores, but not chips. Other companies take an ARM core, wrap it with something interesting, and sell that chip as a product. There are many websites that show the deconstruction/disassembly of quite a few phones and other products. You need to research this. I would be surprised if there are phones that use commercially available processors, but if there are, those are the chips you should look for. I suspect most are custom made for that phone or phone vendor, and you are likely not going to be able to even get a datasheet, much less a way to buy them on a board (other than buying a phone, of course).
ARM is a very good choice for phones; there are many good reasons why ARM is used in phones and most other handheld devices. Go to ARM's website and look at the cores available. Compare that to what websites claim phones or other similar devices (iPad, Kindle, Nook) are using. Then search around for companies that have chips with those cores. You will probably just end up looking at the TI OMAP or the Marvell chip in the OpenRD or plug computer (I don't know its name/model offhand; Kirkwood or something like that). The NVIDIA Tegra is the hot new chip for phones; I think they have an eval board.
If you have not already ported Linux to ARM, then I would actually suggest working on that against QEMU, and wait on purchasing any hardware. As already mentioned, a little googling goes a long way. You can even create and test your phone's user interface software without needing to purchase any hardware. By surfing around and getting a feel for what LCD panels, etc. are used by various vendors, you can get a feel for what your software is going to be limited to or have to do. It won't cost you anything but time to figure out what you do and don't want, or will and won't support, without having to spend any money on hardware.
I would compile everything for a generic ARM at first (ARMv4, for example) and worry about tuning for the particular core later, if at all. Sure, ultimately you are going to need to get a feel for the performance of that core and its caches and MMU, etc., but worry about that later; you have a lot of research and software groundwork to do first. If you must buy hardware, get a BeagleBoard or OpenRD (with all the extras you have to buy to make a BeagleBoard usable/useful, the OpenRD is cheaper, faster, etc.). (Avoid the plug computer; it doesn't fit with what you are trying to do anyway.)
There are many good reasons why linux is not used on phones, so you need to research that as well and decide if that is the path you really want to take.
You are likely going to end up with an MPCore or Cortex-A-something, so the TI OMAP, NVIDIA Tegra, Marvell Sheeva, etc. all fall into that category. You can get a feel for the Cortex-A series from any one of them.
I have to choose a thesis topic soon, and I was considering implementing an operating system for an architecture other than x86 (I'm leaning towards ARM or AVR). The reason I am avoiding x86 is that I would like to gain some experience with embedded platforms, and I (possibly incorrectly) believe that the task may be easier when carried out on a smaller scale. Does anyone have any pointers to websites or resources with examples of this? I have read through most if not all of the OS-development questions on Stack Overflow, and I am also aware of AVR Freaks and OSDev. Additionally, if anyone has experience in this area and wants to offer some advice regarding approach or platform, it would be much appreciated.
Thanks
Developing an (RT)OS is not a trivial task, but it is very educational. My advice is to start hardware-independent. The PC is a good starting point, as it comes with plenty of I/O possibilities and good debugging. If you create a kind of virtual-machine application, you can build something with simple platform capabilities (console output and some buttons/indicators are a good start). You can also use files, for instance, to output timing (schedules). If you start on 'bare metal' you'll have to start from scratch, and debugging on an LED (on/off/blinking) is very hard and time-consuming. My second piece of advice is to define your scope early: is it the scheduler, the communication mechanisms, or the file system you're interested in? Doing it all can easily turn into a lifelong project.
Samek, Miro, Practical UML Statecharts in C/C++ contains some interesting sections on a microkernel. It's one of my favorite books.
Advanced PIC Microcontroller Projects in C: From USB to RTOS with the PIC 18F Series seems to cover some of your interests; I haven't read it yet, though. Operating Systems: Internals and Design Principles may also bring good insights; it covers all aspects, from the scheduler to the network stack. Good luck!
Seems like you should get a copy of Jean Labrosse's book MicroC/OS.
It looks like he may have just updated it too.
http://micrium.com/page/press_room/news/id:40
http://micrium.com/page/home
This is a well-documented book describing the inner workings of an RTOS written in C and ported to many embedded processors. You could also run it on x86, and then cross-compile to another processor.
Contiki might be a good thing to research. It's very small, runs on microcontrollers, and is open source. It has a heavy bias towards networking and communications, but perhaps you can skip those parts and focus on the kernel.
If you choose ARM, pick up a copy of the ARM System Developer's Guide (Sloss, Symes, Wright). Link to Amazon
Chapter 11 discusses the implementation of a simple embedded operating system, with great explanations and sample code.
ARM and AVR are chalk and cheese - you've scoped this very wide!
You could produce a very different and more sophisticated OS for ARM than AVR (unless you are talking about AVR32 perhaps - which is a completely different architecture?).
AVR would be far more constraining, to the point that the task may be too trivial for the scope of your thesis. Even specifying ARM does not narrow it down much; low-end ARM parts have small on-chip memories, no MMU, and simple peripherals, while higher-end parts have an MMU, data/instruction caches, often a GPU, sometimes an FPU, hardware Java bytecode execution, and many other complex peripherals. The term 'ARM' covers ARM7, ARM9, ARM11, Cortex-M3, Cortex-A8, plus a number of architectures intended for use on ASICs and FPGAs - so you need to narrow it down a bit, perhaps?
If you choose ARM, take a look at these resources, especially the Insider's Guides from Hitex and "Building bare-metal ARM with GNU"; they will help you get your board up and form a starting point for your OS.
Silly as it may sound, I was recently interested in the Arduino platform to learn some hacking tricks with the help of more experienced friends. There was also this thread for a guy interested in writing an OS for it (although not his primary intention).
I think the Arduino is very basic and straightforward as an educational tool for such endeavors. It may be worth checking out to see if it fits the bill.
The first thing I recommend is to narrow your thesis topic considerably. OSs are ubiquitous, well researched and developed. What novel idea do you hope to pursue?
That said, AvrX is a very small microkernel that I've used professionally on AVR microcontrollers. It is written in assembly. One person started porting it to C but hasn't finished. Finalizing that port to C and/or making a C port for the AVR32 architecture would be valuable.
An OS should not be tightly coupled to any particular processor, so ARM vs. x86 doesn't matter.
It would become a bigger topic if we started discussing whether ARM is embedded and x86 is not. Anyway, there are many, many places where x86 processors are used for embedded software development.
I guess most of the kernel code will be just plain C. There are many free OSes already available, for example embedded Linux, the free version of ITRON, MINIX, etc. It will be a daunting task.
But on the other hand, what you can try is porting embedded Linux to platforms on which it does not yet run. That would be really useful to the world.
An RTOS is almost never architecture-specific. Refer to any RTOS architecture available on the net and you will notice that a CPU/hardware abstraction layer abstracts out the CPU. The board-specific portions (those dealing with peripherals such as COM ports, timers, etc.) are abstracted by a board support package.
To begin with, get an understanding of how multi-threading works in an RTOS: try implementing simple context-switch code for the CPU of your choice. This will involve code for creating a thread context, saving a context, and restoring a saved context, and it will form the basis of your hardware abstraction layer. The initial development can easily be accomplished using a software simulator for the selected CPU.
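As a hedged sketch of what that boils down to (all names here are illustrative, not from any particular RTOS), the portable part is a thread control block and a pick-next routine, while the register save/restore is the small CPU-specific primitive you would write in assembly for your chosen processor or simulator:

```c
#include <stddef.h>

#define MAX_THREADS 4

typedef struct {
    void *sp;      /* saved stack pointer of the thread */
    int   ready;   /* nonzero if the thread is runnable */
} tcb_t;

static tcb_t threads[MAX_THREADS];
static int   current;

/* CPU-specific primitive, written in assembly per architecture: push the
   callee-saved registers, store the old stack pointer, load the new one,
   pop the new thread's registers, and return into it. */
extern void cpu_context_switch(void **old_sp, void *new_sp);

void schedule(void)
{
    int next = current;
    do {
        next = (next + 1) % MAX_THREADS;   /* simple round-robin pick */
    } while (!threads[next].ready && next != current);

    if (next != current) {
        int prev = current;
        current  = next;
        cpu_context_switch(&threads[prev].sp, threads[next].sp);
    }
}
```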
I agree with the poster who suggested reading the book µC/OS-II by Jean Labrosse. Samples of context-switch code, especially for x86, should be just a Google search away!
http://www.amazon.com/Operating-Systems-Design-Implementation-3rd/dp/0131429388
Pretty solid stuff.