|
That joke gives me a Vogon poetry vibe:
Quote: Like jowling meated liverslime, Groop,
I implore thee, my foonting turlingdromes,
And hooptiously drangle me,
With crinkly bindlewurdles, mashurbitries.
|
|
|
|
|
Racing the beam is a way to put graphics on a screen without having any graphics memory. It was used in the 1970s, when memory chips still had tiny capacities and cost their weight in gold or more. The Atari VCS is a well-known console that used it: with only 128 bytes of RAM, and programs on ROMs in the cartridges, there was no room at all for any video buffer.
It is called 'racing the beam' because most of the time the processor is busy staying just ahead of the electron beam of the CRT, putting the graphics data that will be displayed next directly into the registers of the graphics chip just in time. Be too quick or too slow and you get only garbage on the screen. And such luxuries as actual gameplay had to wait until the graphics chip was done with the current frame and entered the vertical blank period before starting the next one.
Horribly fragile code and a nightmare to debug. Proper debugging tools as we know them did not exist yet. But programmers who can deal with such old stuff are afraid of nothing.
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
|
|
|
|
|
Being an expert on the 'you have only garbage on the screen' part, I fear nothing.
"In testa che avete, Signor di Ceprano?" ("What is that you have on your head, Signor di Ceprano?")
-- Rigoletto
|
|
|
|
|
Yes, in a variety of scenarios. On the Commodore 64, you could get an interrupt at a specific raster line, and one of the geniuses I worked with figured out that you could double the apparent sprites by interrupting halfway during the vertical rendering of the screen and switch the sprite bank pointers. Flip back during vertical refresh.
I also hand-coded, counting 80286 instruction cycles, the assembly code necessary to flip a video digitizing board from "read" to "write". See, we had this multispectral camera with a spinning disk of 6 bandpass optical filters in front of the CCD sensor, where the rotation of the disk with the glass filters was sync'd to the vertical refresh rate of the CCD (the flip side of racing the beam). So, every 1/60th of a second, you'd get a different image through a different filter, which was something of a visual mess when looking at different spectrum slices.
I figured out how to put the digitizer board into "read" mode for one field and "write" mode for the other 5, so you could get a stable real-time image of a specific filter. All of it had to be done during the vertical refresh period.
|
|
|
|
|
C64 raster interrupts were too much fun. You could change video modes, or increase the number of colors that could be displayed on the screen at one time using the raster interrupt. Way too much fun.
|
|
|
|
|
Never used it as such, though I'd heard about it. I used a similar system for RF units using a PIC. All the data was in an array that was picked off and broadcast; the issue was that the RF unit was too slow for the array pointer, so tricks had to be used to slow things down so that the pointer picked up the next value. I miss those days of staring at a scope and your code, trying to figure out why. I know I'm odd, but I miss those days.
|
|
|
|
|
I've not done something quite as tricky as that, but I have resorted to similar tricks on an IBM CGA adapter.
The CGA adapter's memory was not dual-ported, so writing to it at arbitrary times would interfere with the adapter's access to the memory, causing "snow" to appear on the screen. The BIOS got around this by updating the memory only during vertical refresh, but that was slow. It was discovered that you could just about write one byte to memory during the horizontal retrace, speeding up output considerably.
This had to be done while interrupts were disabled, so you wouldn't miss the window. Ensuring that no interrupts were missed during the screen update made this interesting...
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
Even dual-ported video memory did not solve this problem entirely. There still had to be hardware mutual-exclusion logic to prevent conflicts. Some graphics chips, like the Motorola MC6847, were nice enough to provide a signal telling you when they were accessing video RAM and when they were not. That was practically all you needed for mutual exclusion. Letting the graphics chip trigger an interrupt upon entering the vertical blank was another option. This also automatically eliminated the problem of any other interrupts, since you were already in an interrupt routine.
But, of course, that opens another can of worms wherever nested interrupts are allowed or things like non-maskable interrupts are a thing.
|
|
|
|
|
Can you imagine if the interrupter gear that allowed WWI aeroplanes to fire their machine guns through the propeller had been controlled by such software? A missed interrupt and... BANG! There goes another propeller!!
- I would love to change the world, but they won’t give me the source code.
|
|
|
|
|
Some things are better handled by hardware...
|
|
|
|
|
Memory lane.
Tektronix used to have 4010 and 4014 graphical display terminals using storage tubes as their display memory. They were like an electronic Etch A Sketch: clear the screen, then draw on it much like a line plotter. I used them a lot in grad school as terminals for a mini-computer (a rack-mounted Data General Eclipse). Computer graphics was becoming a core course for computer science and math majors at the time. Anyway, the point is that this was the early days of graphical displays that did not require lots of expensive dedicated raster memory (those types of displays were very expensive and used mostly by the CGI business, i.e. movies).
"A little time, a little trouble, your better day"
Badfinger
|
|
|
|
|
Yes, but not a CRT: cardboard boxes going down a conveyor.
I used to design coding and marking equipment for batch-coding goods and boxes of goods as they go down the conveyor. Limited processor (32MHz Z80) and even more limited RAM (32K), but 128 individual dots (35 picolitres per dot) to print in a single vertical slice, then a brief pause, a second slice, and so on. Most companies get the signal from the photocell and build a bitmap image to print. But that uses RAM I hadn't got, and takes time on a slow processor. So instead, I got the photocell signal, grabbed the date and time (which could be part of the print) and started a timer interrupt. Each time the timer ticked, I generated a new slice, fired it out of the jets, then set up for the next one.
Basically I deferred the processing from a lump at the start to a little bit each dot.
Worked brilliantly on boxes of product, but was just a fraction too slow to cope with high-speed lines of individual products - it would reach the point where the processing took longer than the timer interval and the message would stretch across the product.
And walking round the supermarket I still see units that I designed and coded working fine even today ... I recognise the font I used and its slight imperfections!
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
LOL I never knew it was called that but I race the beam in IoT. Of course now I have a coprocessor to help me.
RGB interface displays use 1 wire for each color bit. Those wires source bits out of RAM while the display scans up and down, left to right. You have to spit the bits at exactly the right time to get the display to show correctly.
I use a little feature in the ESP32 called GDMA to make it possible; it takes the chasing away from the CPU. Basically you can connect 1 bit to 1 wire - up to 16 at a time - point it at a memory buffer, and the GDMA engine, at a frequency you give it, will read or write data to or from that buffer using those wires.
Aside from that, I've emulated racing the beam several times, building old school emulators, like Nintendo emulators.
Fun times!
Check out my IoT graphics library here:
https://honeythecodewitch/gfx
|
|
|
|
|
Not racing the beam as such, but spotting the beam, yes.
In around 1981 I built myself a UK101 kit computer; 6502 CPU, (originally) 4k memory and RF output to a monochrome television of 32 (originally 16) rows and 64 columns. Complete with full logic diagram. I'd made various mods to the system, but decided it would be cool to be able to "draw" directly on the screen. This was in the days before mouse pointers, tablets, touch-screens etc. I knew the image on the CRT was a bright dot racing across the screen and figured that if I had a light-sensitive diode, I could trigger a signal in response to the dot passing under it. That signal was connected to an interrupt, and the interrupt processing code accessed what was effectively a hardware tick counter, synchronised to the clock for the video driver. Based on the value of that tick counter I could tell where the electron beam would be, and therefore I could calculate a character row and column.
So long as there were some pixels in the character, and a little adjustment to the TV brightness controls, it could detect the position of the light-sensitive diode pretty accurately. Fit the diode in the end of a "wand" and, hey presto, I could draw lines on the screen.
As you can see my grasp of it all was a little tenuous, but the excitement and joy when it actually worked was amazing... especially at the total cost of a few pennies and a couple of dozen lines of assembler code.
|
|
|
|
|
I built my first computer in 1978 and still have it. So you essentially built a light gun. I wonder, does shooting the (nonexistent) beam still work on modern monitors? Do some game consoles even have light guns? It seems like I saw the last ones some time in the last millennium.
|
|
|
|
|
IIRC, the NES one and some others will not work with modern TVs because they depended heavily on the CRT tech itself. We thought of it as "shooting the TV", but I recall (or may be confusing it with something else) reading that the gun was technically being shot by the beam from the TV, sending that back to the console, which would deduce from it where the gun was aimed.
There are newer ones (Wii and PS4 had them for sure) but they are based on different techs. The PS4 had these big balls of light on the controllers and a camera watched them. The Wii used some kind of IR system with a bar you put in front of the TV.
I suppose games are still "racing the ray" in a sense. It's just you don't get garbage on screen, rather, a frozen screen/choppy framerate if frames aren't coming fast enough.
|
|
|
|
|
Light guns (or light pens) were not very complicated. You could build them yourself with some cheap parts from Radio Shack. All you basically needed was a photocell, a button, and a toy gun to put them into. Many graphics chips simply had registers that told you the current position of the electron beam. When the sensor in the gun detected the electron beam, it had to be pointed at exactly those screen coordinates. No wild calculation of angles or anything like that. But this of course does not work when there is no electron beam to detect.
|
|
|
|
|
Wow, this really took me back 40+ years. While I never "raced the beam" I did plenty of other coding around interrupts (actual REAL interrupts, not the software abstracted ones of today) versus syncing up with the code.
|
|
|
|
|
Interrupts - what a luxury. On my old box I do bit-banged serial communication without a UART. Currently all is well at 19200 baud. 38400 works for single bytes, but not for larger memory blocks. The timing error in the delay loops obviously accumulates too much when too many bytes are sent at once. Maybe I can resynchronize at every start bit, but I would have to overclock the old 8-bit processor a little more. It could go faster than 8 MHz if I raised the processor's core voltage above 5V. Perhaps it would then even become noticeably warm and actually require cooling.
|
|
|
|
|
Impressive! I never went below the UART interrupt level. Your mention of 8MHz reminded me of one of the things I tell the once-a-week Data Structures class that I teach - namely, that when I first started working with microprocessors, clock rates were in kHz, not GHz as they are now.
|
|
|
|
|
Fun times, sounds like. I never got that far back. I coded 6502 asm but Nintendo's hardware (NES) already had a pretty slick PPU that made timing somewhat easy.
Still coded in ARM32 asm for the DSi but that was mostly to do basic geometric transforms in the weird video memory it had (bitmaps were not stored in a straightforward manner).
|
|
|
|
|
The memory addressing logic was designed to let whatever graphics hardware you had access its video buffer quickly, not for the programmer's convenience. The hardware was racing the electron beam for you, so there was little time to waste. And then there is also the old problem of how to synchronize CPU and graphics hardware access to the same memory.
|
|
|
|
|
I'm surprised no-one's mentioned ZX80 and ZX81. They didn't have specialised circuitry to handle the display, but, instead, had the Z80 execute the contents of the screen, ensuring that the Z80 itself saw only NOP until the end of the line. The contents of the data bus (the actual character codes) were then fed to the character generator.
|
|
|
|
|
Z80! Haven't heard that for almost 40 years, I used to write assembler for that, too.
You know this really means we are all a bunch of old farts!
|
|
|
|
|