Ace of Spades graphics, bad, simple or "retro"?
-
AlMualim
League Participant
- Posts: 47
- Joined: Thu Nov 22, 2012 9:25 pm
It's simple: you can't expect good graphics from a game where you can dig and build blocks really fast; it has to have really basic graphics, i.e. blocks. Ace of Spades is 16-bit; 8-bit is stuff like Atari games, the first Mario, or Terraria.
Build and Shoot Signature Generator

[02:43] <AlMualim> [02:42] <AlMualim> i made 4 people talk.
[02:43] <AlMualim> [02:43] <AlMualim> [02:42] <AlMualim> i made 4 people talk.
[02:44] <AlMualim> IRCCEPTION
-
Reki
Deuced Up - Posts: 106
- Joined: Fri Nov 09, 2012 11:07 pm
>implying these choices are mutually exclusive
I am a single bullet.
It has no heart.
Therefore, it does not think.
It just flies straight towards its target.

-
Sonarpulse
Coder
- Posts: 443
- Joined: Thu Dec 13, 2012 7:18 pm
AlMualim wrote:Ace of Spades is 16 bits
You are unintentionally correct, as simple.cpp in Voxlap contains the one and only reference to "DOSMAIN/WINMAIN". Voxlap may well have once run on DOS.
-
Skwid
Deuce - Posts: 12
- Joined: Sun Nov 18, 2012 11:53 pm
The graphics were bad way back when the viewmodels were made of monochrome rectangles and the rifle would turn black when you used the ironsights. Now, with improved particle effects, shell casings, and better-looking blocks, I'd say AoS graphics are right on the border of decent-ish.
-
AlMualim
League Participant
- Posts: 47
- Joined: Thu Nov 22, 2012 9:25 pm
Skwid wrote:The graphics were bad way back when the viewmodels were made of monochrome rectangles and the rifle would turn black when you used the ironsights. Now, with improved particle effects, shell casings, and better-looking blocks, I'd say AoS graphics are right on the border of decent-ish.
About your signature... BOOM HEADSHOT!
-
tallyyyyyy
Winter Celebration 2013
- Posts: 180
- Joined: Sat Nov 10, 2012 1:04 am
AlMualim wrote:It's simple, you can't expect good graphics from a game where you can dig and build blocks really fast, it have to be a really bad graphich = blocks. Ace of Spades is 16 bits, 8 bits are Atari games like the first mario or Terraria
Terraria is NOT 8-bit.
Compare:


Get what I'm saying?
THE BEST WORLD' S RAP-MUSIC !
-
AlMualim
League Participant
- Posts: 47
- Joined: Thu Nov 22, 2012 9:25 pm
tallyyyyyy wrote:Terraria is NOT 8-bit.
Compare:
Get what I'm saying?
Yeah, but what happened in Terraria was: they added some shadows and a few more colors to the blocks and called it better graphics.
-
tallyyyyyy
Winter Celebration 2013
- Posts: 180
- Joined: Sat Nov 10, 2012 1:04 am
8-bit systems didn't have the technology to create shadows like that, draw so many sprites at once, run at that resolution, and much more. Even without the shadows, Terraria has faaaaaaar superior graphics. It's really undeniable.
-
HJK148
Deuce - Posts: 12
- Joined: Thu Dec 13, 2012 6:35 pm
Sponge wrote:Which Ace of Spades?
The good one.
-
Demo123
Blue Master Race
- Posts: 495
- Joined: Mon Nov 05, 2012 3:03 pm
AlMualim wrote:Yeah, but what happened in terraria was: they added some shadows, and some little colors to the blocks and call it better graphics
To me Terraria looks like an average-looking (graphics-wise) SNES game that somehow can't run on a Radeon 9250 with shader model 1.4 support, a 1.6 GHz CPU, and 1 GB of RAM.
-
rakiru
Coder
- Posts: 1349
- Joined: Sun Nov 11, 2012 12:26 pm
-
If you lot keep calling various things 8/16-bit just because they look vaguely similar to games that were around at the time when games consoles were using 8/16-bit processors, I'm going to fucking murderise you with your own damn limbs.
-
GreaseMonkey
Coder
- Posts: 733
- Joined: Tue Oct 30, 2012 11:07 pm
rakiru wrote:If you lot keep calling various things 8/16-bit just because they look vaguely similar to games that were around at the time when games consoles were using 8/16-bit processors, I'm going to fucking murderise you with your own damn limbs.
Ace of Spades is 32-bit. If you think it's 8-bit or 16-bit, you're wrong. If you have any problems with this, there's the door, now walk through it.
Personally I believe the whole "8/16/32-bit" thing is a load of bullshit. Does it refer to the accumulator/main GPR size, the CPU external data bus width, the internal CPU data bus width, the mainboard data bus width, or simply what some moron in marketing decided it was going to be? I swear it's the latter.
And the reason why you are wrong if you think it's not at least 32-bit is this: it's built for the AMD Athlon (or whatever first supported 3DNow) or the Pentium 3 (SSE), and the successors to those things. These are based on the 80386 architecture, which is an extension of the 8086 architecture.
The 80386 architecture, once you get out of the 8086 compatibility mode, offers 32-bit wide general purpose registers / accumulator, I think a 32-bit wide internal data bus, at least a 32-bit wide external data bus (I believe it's 64 bits for the original Pentium), and your mileage may vary with respect to the mainboard data bus width (ISA tends to be 8 or 16 or something, PCI tends to be 32, I think AGP can go up to 64, not sure what VESA local bus goes up to).
By the time Pentiums were starting to get popular, I'm pretty sure they used PCI for the main stuff, and maybe some ISA slots for backwards compatibility (I have a computer somewhere which has a 266MHz Celeron (the original cheapass cut-down version of the Pentium II), 3 PCI slots, 3 ISA slots, and an AGP slot), so there's virtually nothing you can use to say it's 16-bit, let alone 8-bit, except for marketing bullshit.
-----
The NES had a slightly retarded 6502 clone in it, although nowhere near as batshit retarded as the 8080 clone in the Game Boy (e.g. if hl is in the $FE08-$FEFF range and you do INC hl or DEC hl, it corrupts sprite information; I have actually fucked myself over with this bug unwittingly, and that was just doing arithmetic). The Sega Master System had a Z80 in it, which is an extended clone of the 8080. This is the 3rd generation of consoles, and sorry Sony, but generations do not map to PlayStation version numbers, so piss off.
Older stuff that actually uses a CPU rather than a lot of transistor shit is referred to as the 2nd generation. The Atari 2600 and 5200 are 2nd gen (the 7800 is 3rd gen), and they use a 6502 as their main CPU. The Colecovision used a Z80. Most of the other stuff used weird CPUs nobody really cares about these days, such as the Fairchild F8 in the Channel-F and some General Instruments 16-bit CPU in the Intellivision (there goes that notion of there being distinct 8-bit / 16-bit eras).
The definite end of the 2nd gen is the video game crash of 1983, and the main difference between 3rd and 4th gen is that 4th gen was basically 3rd gen on faster, more flexible hardware. The 2nd gen, from my observation, is more varied than the 3rd and 4th gens put together. The 5th gen was where 3D started to take off (Playstation, Nintendo 64, Sega Saturn, Atari Jaguar, 3DO, probably missed something there), but this is occasionally referred to as the 32/64-bit era.
The Sega Mega Drive had a 68000, which depending on who you talk to is either a 16-bit CPU or a 32-bit CPU. The SNES on the other hand had a 65816 (an extended 6502 which has a 16-bit accumulator and a 24-bit address bus), which depending on who you talk to is either a 16-bit CPU or an 8-bit CPU.
-----
I've done some 68000 programming and I can tell you that it's nicer than even the 32-bit Intel CPUs (well, aside from the fact that the assembly syntax is specially crafted to make you shoot yourself in the foot over and over again), so if I had to give a "-bit" to it, I'd regard it as 32-bit - after all, the 8088, an 8086 with an 8-bit wide external data bus, is still regarded as 16-bit, and the later 680x0s which are undebatably regarded as 32-bit aren't all that different from the original 68000 (and can run without needing some "compatibility mode").
I don't know much about the 65816, so I'll leave that to someone who's actually programmed for it.
I like the Z80, except when I have to write an emulator for it, where I end up just hating the bloody thing. I would also choose coding for the 68000 above coding for the Z80 given the choice, but I hear that's even worse to emulate. But yeah, the Z80 has an 8-bit accumulator, and can use 5 different pairs of 8-bit registers as 16-bit registers (this includes IX/IY, and excludes the shadow registers and that bloody internal register unofficially named WZ).
I believe it internally has an 8-bit data bus, even for most of the 16-bit operations, but this does raise an interesting question: is this 8-bit or 16-bit? What if we add in an internal 16-bit data bus? What about the 16-bit HL register, which could be argued to be an accumulator? People tend to regard this CPU as 8-bit, but one could form a plausible case for this being 16-bit.
When I first encountered the 6502, I didn't believe people could do anything serious with it, but it's actually a bloody awesome CPU, and oddly enough is faster per-clock than the Z80 (well, that's because the Z80 uses a minimum of 3 cycles to do a memory fetch). It's also a pretty simple CPU. It has an 8-bit accumulator, two 8-bit index registers, a 16-bit program counter, an 8-bit stack pointer (indexing into page 1), and an 8-bit flags register. I think it also has an internal 16-bit register for subroutine jumps and whatnot.
Internally, it has an 8-bit data bus, and externally, the same deal. Now it's pretty hard to not regard this as an 8-bit CPU, unless you're in marketing. But why is it that a Hudson Soft extended version of it, which is also "8-bit" to the core, is used in a system marketed as 16-bit? (PC Engine / TurboGrafx-16.) The 16-bit thing boils down to the graphics chip in it having a 16-bit data bus, but that's like referring to my desktop as a 64-bit system just because it has a 64-bit graphics card in it (it's got a single-core 32-bit AMD Athlon XP in it). I call bullshit on this one.
Saying 8-bit or 16-bit refers to the type of graphics and/or sound is just asking for a beating. Making "8-bit" music which can't even come close to playing on any of the so-called "8-bit" systems is asking for a truck to be dropped on you, and you should feel very bad for even thinking of doing so, because you are a disgrace to music as well as game consoles.
TL;DR: learn to read, you lazy kid. Go on. Scroll up and read it. Hell, there's usually a printable version so you can print it off and show it to your teacher. You could even write an essay about this rant and actually get marks for it.
-----
And finally, this is roughly how you make shit for an actual "8-bit" console. (As in, don't just pussy out and try to make it sound like a shittily-composed score for a game console. Yes, you CAN make a NES sound like that. No, I did not use any audio expansions.)
-
Xpert658
Build and Shoot's 1st Birthday
- Posts: 321
- Joined: Sat Nov 10, 2012 8:38 am
Yeah I say AoS is retro, but for some reason, I GET 40 FPS ON 1024x768 WTH.
IGN: ForTheHype
RoTAdolf_Hitler wrote:Jdrew is a gay jew
Skype: Mr. Dr.