I’ve released a new version of the Amiga core (svn.fpgaarcade.com).
New features:
48MB of XRAM mapped as Chip RAM, and all chipset DMA pointers support 64MB addressing (optionally enabled in the OSD / .ini file)
CPU cache stable – 2-way instruction + data caches with full snoop. ~16,000 AIBB Dhrystones
Filtered & Mono audio
RTG over analog/DVI/HDMI with hardware sprites and dedicated blitter engine. 1024×768 at 16-bit, 1280×1024 at 8-bit.
(1920x1080i works over analog but not digital for some reason at the moment)
Real Amiga keyboard support (thanks to Erique)
I’m going to spend some time now updating the website and working on the AGA debug hardware so we can further improve compatibility. I also have some CPU upgrades in the pipeline which should hopefully allow a 2–3× speed-up.
A lot of work has been done both on the CPU, to increase compatibility, and on the core, to get the cache working and achieve timing closure.
The design is completely constrained, and this results in a very stable platform.
2K instruction plus 2K data cache with full chipset snoop on both. Performance is around 11.3 MIPS / 10,800 Dhrystones with SysInfo 4.0.
The RTG dedicated blitter is working, and this coupled with the fast hard disk speed makes the system feel highly responsive.
I’ve added some modes to the scan doubler to help DVI monitors work.
Low-pass filters on the audio (optional).
HRTMon cart working again with custom-register & CIA shadowing and full VBR support. The config is editable from the .INI file.
More details on the forum.
The AGA core is pretty stable now, but one issue that has been haunting me for ages is that some blitter operations go wrong and leave garbage on the screen.
I’ve been working with Jim Drew, who pointed me at some source code released by DMA Design for Menace – published in Amiga Format issue 8 or 9, I think: AmigaFormatIssue009
This was a fairly simple sub-section of the game which showed how the parallax scroll worked. It also nicely exhibited the problem with my core. When you are debugging complex hardware, you usually use a simulator to see what’s going on. It takes several minutes to simulate one frame of video (on bigger projects it can take hours), so it really helps to get as simple a test case as possible. The testbench I use can load SREC files from the cross assembler directly into the DRAM in the sim, so I can run exactly the same software as on the hardware.
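For the curious, the SREC (Motorola S-record) format the testbench consumes is simple to decode. Here is a minimal sketch in Python – purely illustrative and not part of the actual VHDL testbench – showing how S1/S2/S3 records map hex lines to memory contents:

```python
def load_srec(lines):
    """Parse Motorola S-record lines into an {address: byte} dict."""
    mem = {}
    addr_len = {'1': 2, '2': 3, '3': 4}  # address width in bytes per record type
    for line in lines:
        line = line.strip()
        if not line.startswith('S') or line[1] not in addr_len:
            continue  # skip header (S0) and termination (S7-S9) records
        n = addr_len[line[1]]
        count = int(line[2:4], 16)             # bytes following the count field
        body = bytes.fromhex(line[4:4 + 2 * count])
        payload, checksum = body[:-1], body[-1]
        # checksum is the one's complement of the low byte of (count + payload)
        assert (count + sum(payload) + checksum) & 0xFF == 0xFF, "bad checksum"
        addr = int.from_bytes(payload[:n], 'big')
        for i, b in enumerate(payload[n:]):
            mem[addr + i] = b
    return mem
```

A real loader would also honour the termination record's entry address; this sketch only fills memory.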
To make sure the OS doesn’t get in the way, I upload a simple bootstrap binary to 0xF80000 (where the Kick ROM would live) which jumps to 0x20000 in chip mem – where I have also uploaded my test program and any resource files needed. The files are in the forum here
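A hedged sketch of what such a bootstrap might look like in 68k assembly – the label and the DMA-disable write are my own assumptions; only the two addresses come from the text above:

```
        org     $F80000          ; Kick ROM space in the testbench
Boot:   move.w  #$7FFF,$DFF096   ; assumption: clear all DMA enables first (DMACON)
        jmp     $20000           ; jump to the test program preloaded in chip mem
```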
I took the Menace code and kept chopping bits out until I had the simplest possible case where the problem was still visible. Just blitting one tile over and over showed random corruption. I then saw in the simulator that the next blit was being started by the CPU before the previous one had finished. Odd – so how could this have worked on real hardware?
The author, David Jones, had a clever trick: DMA “nasty” was enabled, which stalled the CPU until the blit had finished, and only then would the CPU set up the next one.

The problem with the core was that we have 16 times as many memory access slots as the original A500, and despite my best efforts the CPU was still getting in. I’ve just updated the core, which seems to resolve the problem. When in A500 or A1200 compatibility mode, the Gary module holds off any CPU access to chipset resources until a colour clock cycle (roughly 3MHz). If a collision with a request from the DMA controller happens, the CPU is stalled until the next colour clock cycle – exactly as in the real machine. I also tweaked the soft CPU so it will only start bus cycles every 4 clocks in A500 mode, or every 2 clocks in A1200 mode.
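For reference, the two halves of that trick look roughly like this in 68k assembly, using the standard custom-chip register names (the labels are mine): set the BLTPRI (“blitter nasty”) bit in DMACON, and wait on the blitter-busy bit of DMACONR before issuing the next blit.

```
CUSTOM   equ $DFF000
DMACON   equ $096                     ; DMA control (write)
DMACONR  equ $002                     ; DMA control (read); bit 14 = BBUSY

        move.w  #$8400,CUSTOM+DMACON  ; SET + BLTPRI: "blitter nasty" on

WaitBlit:
        tst.w   CUSTOM+DMACONR        ; dummy read (classic work-around)
.busy:  btst    #6,CUSTOM+DMACONR     ; bit 14 of the word = bit 6 of the high byte
        bne.s   .busy
        rts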
When the system is put into “turbo” mode, these constraints are turned off and everything goes at full speed. As a result, Menace, Slam Tilt AGA and Shadow of the Beast seem to work fine, although more testing is to follow. I need to do a bit more tidying up, then it’s back to the DVI / RTG (enhanced graphics card) work.