pouët.net

How does the PS1 CPU compare with old PC hardware?

category: general [glöplog]
As a former PSX programmer (Time Commando, Little Big Adventure, The Smurfs, V-Rally 2), I have to say that it was far from being the worst machine I worked on. If I compare the development kit (hardware and software) to its equivalents of the time (Jaguar, Saturn), it was much better.

Now performance-wise, the beast is well balanced, so it's quite easy to get the hang of it and make everything work. I would say that the important points to consider are:

  • the GTE (Geometry Transformation Engine) unfortunately has 16-bit translations, which makes large worlds more complicated to do than they should be.
  • the R3000 is not particularly fast, but it's the type of CPU that has quite a few general-purpose registers, so it's easy to write assembler for it.
  • The biggest issue is the cache management: the cache is direct-mapped, with memory addresses mapped modulo the cache size (4 KB, I think). That means that if you are unlucky you can write code which is always out of cache, because the subfunction you are calling sits at the same modulo address as the calling code (a map file helps with that; see the sketch after this list).
  • The 1 KB scratchpad (zero-latency memory) is the secret to performance.
  • Hand-made OT (ordering table) list management and DMA transfers are what give the final edge.
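
To make the cache point concrete, here is a minimal sketch in C with assumed numbers (4 KB direct-mapped cache, 16-byte lines; the addresses are hypothetical) of how two code addresses that are equal modulo the cache size land on the same cache line and keep evicting each other:

    #include <stdint.h>
    #include <stdio.h>

    #define CACHE_SIZE 4096u   /* assumed: 4 KB direct-mapped I-cache */
    #define LINE_SIZE  16u     /* assumed line size, for illustration */

    /* A direct-mapped cache picks the line purely from the address:
       same (addr % CACHE_SIZE) means same cache line. */
    static unsigned cache_index(uint32_t addr)
    {
        return (addr % CACHE_SIZE) / LINE_SIZE;
    }

    int main(void)
    {
        uint32_t caller = 0x80010100u;             /* hypothetical code address */
        uint32_t callee = caller + 3 * CACHE_SIZE; /* equal modulo the cache size */
        printf("caller line %u, callee line %u -> %s\n",
               cache_index(caller), cache_index(callee),
               cache_index(caller) == cache_index(callee)
                   ? "they evict each other on every call" : "no conflict");
        return 0;
    }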

Both Time Commando and Little Big Adventure were targeted at 486 DX/DX2-class hardware, and it was a non-issue getting the code to run on the PSX (the issue was memory usage).
added on the 2014-02-11 17:47:57 by Dbug
Quote:
Both Time Commando and Little Big Adventure were targeted at 486 DX/DX2-class hardware, and it was a non-issue getting the code to run on the PSX (the issue was memory usage).


Thx for the answer, Dbug!
Btw, how were the 3D games from the later era ported then? NFS High Stakes (1999), for example? The original game required a P200 with 32 MB RAM minimum. I know that TNFS:SE from 1996, which ran acceptably on a 486 in 320x200, was somehow ported to the PSX, but the visuals in NFSHS on PSX tend to look much better than those in TNFS:SE. How can that be?
added on the 2019-09-24 22:27:27 by redigger
The PlayStation 1 had hardware-accelerated texture mapping and special extensions (the "Geometry Transformation Engine") for fast matrix*vector calculations.
So the main CPU was only used to set up the rendering.
PC games of the time were usually doing software rendering (at least as a fallback for non-Voodoo users).
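
As a rough illustration (this is not the real GTE interface; the 4.12 fixed-point format and all names here are assumptions), this is the kind of matrix*vector work the GTE performs in hardware, which a plain CPU would otherwise grind through per vertex:

    #include <stdint.h>
    #include <stdio.h>

    /* 3x3 rotation matrix times vector in 4.12 fixed point, roughly the
       operation the GTE accelerates. Illustrative sketch only. */
    typedef struct { int16_t m[3][3]; } Matrix;  /* 4.12 fixed-point entries */
    typedef struct { int16_t x, y, z; } Vec3;

    static Vec3 transform(const Matrix *r, Vec3 v)
    {
        Vec3 o;
        o.x = (int16_t)(((int32_t)r->m[0][0]*v.x + r->m[0][1]*v.y + r->m[0][2]*v.z) >> 12);
        o.y = (int16_t)(((int32_t)r->m[1][0]*v.x + r->m[1][1]*v.y + r->m[1][2]*v.z) >> 12);
        o.z = (int16_t)(((int32_t)r->m[2][0]*v.x + r->m[2][1]*v.y + r->m[2][2]*v.z) >> 12);
        return o;
    }

    int main(void)
    {
        Matrix id = {{{4096,0,0},{0,4096,0},{0,0,4096}}}; /* identity: 1.0 == 4096 */
        Vec3 v = {100, 200, 300};
        Vec3 t = transform(&id, v);
        printf("%d %d %d\n", t.x, t.y, t.z);              /* prints 100 200 300 */
        return 0;
    }

Doing those multiply-accumulate rows per vertex on the R3000 alone would eat the frame budget; the GTE does it in a handful of cycles.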
added on the 2019-09-25 10:26:32 by hfr
Ah, "redigger". I get it.
added on the 2019-09-25 10:27:52 by hfr
Also, gamedevs simply push the consoles to the edge, since they target the same hardware for lots of years to come. I am gaming on PC, but at times I am fascinated when I see some of the titles from the late era of a console, for example on PS2 or PS3. If you look at the hardware specs of those, especially the low memory, and then look at some of the latest titles, it blows my mind how they made them fit. If I described the specs of a console to you and then told you I am gonna port game A to it, you would laugh in my face.

Case in point, I remember (and certainly there are other titles) this open-world racing game I loved playing, Test Drive Unlimited: it was a Pentium 4 era game wanting at least 1 GB of RAM, and they did a port to the PS2. Then I checked the specs: it has 32(!!!) MB of RAM, and I bet the CPU/GPU is a generation behind the PC I needed to run TDU ok. I had 32 MB on a Pentium 1 PC. And it's running the open-world TDU, and it runs great! Also, the latest PS3 games would trick me into thinking they are modern PC titles (never mind the obsessives who put up comparison videos and check for little details nobody notices); I could try The Last of Us or Heavy Rain or Uncharted on PS3 and would never once think that the graphics are last gen (unless you check for nitpicky details). One part of the optimization, though, would be to reduce assets, but in such a way that it is not as noticeable.
added on the 2019-09-25 18:47:11 by Optimus
There are tons of reasons, but one is that the tradeoffs are simply different. On a console, shaving off 1MB of RAM use can mean the difference between shipping or not (and thus be worth spending a month of developer time on); on PC, it might mean some low-end PCs drop a bit in framerate when swapping now and then. There's something to be said for what's possible if you _have to_; when your target is fixed, your choices are between coming up with exceedingly clever optimizations (including cutting assets in creative ways) and not shipping.

Related article: https://www.gamasutra.com/view/feature/131556/postmortem_angel_studios_.php
added on the 2019-09-25 22:14:29 by Sesse
Carmack's reflection on the Doom port to PSX is also a nice read about how the PSX hardware completely mismatched the actual technical needs of realtime 3D graphics at that time (particularly the horrid affine texture mapping), so basically they had to tessellate the shit out of the 3D, which was actually doable considering the hardware was there for it. i believe it's documented in that free Doom engine book by that Fabien Sanglard guy
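
as a side note, here's a tiny numeric sketch in C (illustrative numbers only) of why affine mapping warps and why tessellation helps: affine interpolates u linearly in screen space, while perspective-correct rendering interpolates u/w and 1/w and divides, and subdividing the polygon shrinks the gap between the two.

    #include <stdio.h>

    int main(void)
    {
        double u0 = 0.0, u1 = 1.0;   /* texture coords at the span ends */
        double w0 = 1.0, w1 = 4.0;   /* depths: far end is 4x further away */
        for (double t = 0.0; t <= 1.0; t += 0.25) {
            double affine  = u0 + (u1 - u0) * t;                /* what the PSX does */
            double correct = ((u0/w0) + ((u1/w1) - (u0/w0)) * t)
                           / ((1/w0) + ((1/w1) - (1/w0)) * t);  /* true perspective */
            printf("t=%.2f  affine u=%.3f  perspective u=%.3f\n", t, affine, correct);
        }
        return 0;
    }

at t=0.5 affine gives u=0.5 but the correct value is 0.2; cut the polygon in half and each piece only has to bridge a much smaller error, which is exactly what the tessellation buys you.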
added on the 2019-09-26 15:03:47 by maali
@sim:
reading this 5 years later:
your assumptions were correct, good reading-up beforehand! :P

you could have read my pouet-account-page as well, tho....to not disregard me as someone only talking bull! ;) (i've been into coding for 35+ years by now...reading specs makes me decide if sth is or is not worth it...as seen in the PS...wasn't successful at all, right?! ;) HAHA!)

Seriously: if you really want to make sth really good on that shitty machine you need to dive into cache-misses and how to abuse them for some years first! :P HAHAHA! -> PS1 suxx! -> then earn BIG cash afterwards on any other machine! ;) cash on cache! )
Quote:
PS1 suxx


disagree. PSX was a pretty elegant machine for its time. and still is. very clever combination of off-the-shelf RISC CPU, custom hardware and smart engineering (ordering tables). very cost effective solution also.
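
for the curious, a minimal sketch of the ordering-table idea in C (names and sizes are illustrative, not the actual PSX libraries): an array of list heads indexed by quantized depth, primitives pushed into their depth bucket, then the table walked back to front, which gives painter's-algorithm order with no sorting at all.

    #include <stdio.h>

    #define OT_LEN 1024                          /* illustrative table size */

    typedef struct Prim {
        struct Prim *next;
        int id;                                  /* stand-in for GPU packet data */
    } Prim;

    static Prim *ot[OT_LEN];                     /* one list head per depth slot */

    static void ot_insert(Prim *p, int z)        /* z: 0 = near, OT_LEN-1 = far */
    {
        if (z < 0) z = 0;
        if (z >= OT_LEN) z = OT_LEN - 1;
        p->next = ot[z];                         /* O(1) push into the bucket */
        ot[z] = p;
    }

    int main(void)
    {
        Prim a = {0, 1}, b = {0, 2}, c = {0, 3};
        ot_insert(&a, 500);
        ot_insert(&b, 900);                      /* farthest: drawn first */
        ot_insert(&c, 10);                       /* nearest: drawn last */
        for (int z = OT_LEN - 1; z >= 0; --z)    /* walk back to front */
            for (Prim *p = ot[z]; p; p = p->next)
                printf("draw prim %d (z=%d)\n", p->id, z);
        return 0;
    }

on the real hardware the walk is essentially free: the DMA controller follows the linked list itself and feeds the GPU.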

if it's too hard, you're too weak :)
added on the 2019-09-27 10:53:02 by arm1n
agree with spike. PSX was not so bad, especially if you compare it with what the others did:

- Saturn: weak CPU (that's why they put two inside, to compete with the PS1), hard to program, no real transparency, quad-based rendering instead of triangle-based (designed for the Japanese market, where RPGs and 2D fighting games were predominant).

- N64: no CD-ROM (it used cartridges), texture caching issues that prevented games from using its full potential.
added on the 2019-09-30 10:05:13 by Tigrou
Quote:
The PlayStation 1 had hardware-accelerated texture mapping and special extensions (the "Geometry Transformation Engine") for fast matrix*vector calculations.
So the main CPU was only used to set up the rendering.
PC games of the time were usually doing software rendering (at least as a fallback for non-Voodoo users).


Ah well, now I finally get it. Y'know, I have been a PC owner almost all my life, and I used to have some clear prejudices about consoles. I got my first 3D accelerator, a TNT2, in 2000, and since then I used to think that every piece of hardware that outputs blocky textures must be doing it all in software. Hence I always wondered how the PSX did it all with 33 MHz. A few years ago I discovered that the PSX had a hardware accelerator, and now I finally get that it was (in general) like a Voodoo+CPU setup, just slower and in a somewhat different form factor... Thank you!

Quote:
Ah, "redigger". I get it.


You get what?))) Someone told me that it sounds like "gravedigger", but actually it is after the Deep Purple instrumental "Coronarias Redig"))) But it's ok if that was your thought))) And, well, I really do like to "dig" up everything I want from the past, mostly because I wish to reconstruct those times when I was a child, when there was a whole world of people and things that I had no idea about (mostly the 90s).
added on the 2019-10-02 00:17:18 by redigger
to Optimus:

Yes, I think the same nowadays. The common graphics quality is so advanced now (and has been for the past 8 years or so) that if you don't look for some fine detail, it all looks spectacular and realistic.
added on the 2019-10-02 00:20:12 by redigger
to Sesse:

I have read your link. And I even got my hands on RE2 for the N64 just to see those cinematics on cartridge! These guys surely did a great job. And it's a pity that, despite having way better hardware than the PSX, the N64 was crippled by storage limitations, so devs had to be really resourceful.
added on the 2019-10-02 00:49:05 by redigger
to Maali:

I have read the stuff about the Doom ports and Carmack's tweet about the thing. It seems like the PSX never overcame its memory limitations, 'cause the simplified Doom maps from the Jaguar port were reused on the PSX... And while I understand that the coloured lighting in the PSX port is so lovely to many people, I am adamant about the superiority of the original PC Doom.

Love that Sanglard guy. It was a real pleasure to discover a professional digging through all that gaming stuff that had mostly been covered by amateurs/enthusiasts rather than professional programmers (IMHO).
added on the 2019-10-02 00:55:08 by redigger
Quote:
@sim:
Seriously: if you really want to make sth really good on that shitty machine you need to dive into cache-misses and how to abuse them for some years first! :P HAHAHA! -> PS1 suxx! -> then earn BIG cash afterwards on any other machine! ;) cash on cache! )


Sounds really discouraging to me)))
added on the 2019-10-02 00:56:01 by redigger
Quote:
Quote:
PS1 suxx

very cost effective solution also.


Yeah, that's the thing. Surely Sony did it all for money and market share first, so it HAD to be cost effective. That's business...
added on the 2019-10-02 00:59:00 by redigger
Quote:
texture caching issues that prevented games from using its full potential.


That's what's really appalling about the N64. It had a 4 KB texture cache, if I am correct! I wonder if it had something to do with bandwidth, to make the thing cheaper to produce...
added on the 2019-10-02 01:01:46 by redigger
PSX Doom: yes, it's not a full polygon per wall segment, but it's not even subdividing the surfaces the way other polygon-based games would do. It really seems to end up with the same rendering elements as the PC, possibly following the same pipeline even though I don't have the code to confirm (from BSP to columns, from there to visplanes, etc.): vertical columns for walls, horizontal spans for floors. Of course it's two triangles per column; I'm not sure if they are 1 or 2 pixels wide. But if you run an emulator that can display the wireframe and try that on Doom, you'll see it's a dense mesh of very thin vertical and horizontal series of columns/spans, made of triangles.

I'd previously thought that it used its own engine (which could still be massively rewritten, I don't know) and imagined it drew real, bigger polygons, or at least ones subdivided only every so often (e.g. every 8 pixels), but it also seems like a mimic of the original PC rendering elements, just like the rest of the consoles (some of which could possibly use software rendering; the 3DO uses the CEL quad rasterizer hardware, but for many 1-pixel-thin stretched columns/spans). Also, the PlayStation will generate more triangles when there is texture repeat, so a horizontal span for the floor from one side of the screen to the other might have to be split 2, 3, 4 or more times, so you'd have 4 * 2 triangles. It's quite dense; I'm surprised the hardware can pull off so many thin triangles. I heard it has to do with the fact that texture coordinate repeat does not exist on the PS1 (it might clamp or do something else; see the sketch below).
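
A small sketch in C of the splitting described above, under stated assumptions (256-texel texture pages, no hardware wrap; the splitting rule and names are mine, not the actual port's code): a span whose u coordinate crosses a page boundary gets cut at every multiple of 256, so each piece stays inside one repeat.

    #include <stdio.h>

    #define TEX_REPEAT 256  /* assumed texture page width in texels */

    /* Stand-in for building the actual GPU primitive for one piece. */
    static void emit_piece(int x0, int x1, int u0, int u1)
    {
        printf("piece: x [%d..%d], u [%d..%d]\n", x0, x1, u0, u1);
    }

    /* Split a span covering screen x0..x1 with u rising linearly from
       u0 to u1, cutting at every TEX_REPEAT boundary. */
    static void split_span(int x0, int x1, int u0, int u1)
    {
        int x = x0, u = u0;
        while (u < u1) {
            int ubreak = (u / TEX_REPEAT + 1) * TEX_REPEAT;  /* next boundary */
            int uend   = ubreak < u1 ? ubreak : u1;
            /* map uend back to screen x by linear interpolation */
            int xend   = x0 + (int)((long)(x1 - x0) * (uend - u0) / (u1 - u0));
            emit_piece(x, xend, u % TEX_REPEAT, u % TEX_REPEAT + (uend - u));
            x = xend;
            u = uend;
        }
    }

    int main(void)
    {
        split_span(0, 320, 0, 640);  /* a floor span repeating the texture 2.5x */
        return 0;
    }

Run it and you get three pieces for that one span; double that for triangles and you see where the dense wireframe comes from.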
added on the 2019-10-03 20:28:09 by Optimus
https://fabiensanglard.net/doom_psx/index.html

That's a nice write-up. It seems that they scrapped the visplanes algorithm (which is a big CPU bottleneck in 3DO Doom) for something else.
added on the 2020-06-02 22:37:21 by Optimus
Great page, thanks for the link Optimus!
added on the 2020-06-04 21:33:28 by rutra80
