Why aren't other old peripherals better than modern ones, like keyboards are?

Findecanor

15 Mar 2022, 20:04

CaesarAZealad wrote:
14 Mar 2022, 21:53
Idk man, the Xbox 360 controller has kinda cemented itself as the default controller in the minds of most people. I remember when Sony plagiarized it and made a similar asymmetrical controller, it really became apparent to me that that design ain't going anywhere.
Sony? You mean Nintendo with the Switch Pro controller being close in appearance?
By "asymmetric" you mean the layout of D-pad and left analogue stick? The Nintendo Gamecube controller and the original Xbox had those in 2001, and both of those copied the Sega Dreamcast controller.
But I'd agree that most new gamepads have an enclosure shaped pretty much like the Xbox 360 controller's.

A proper arcade joystick still gives you better control than the D-pad on any of these. But, no, it does not do analogue.
Bjerrk wrote:
15 Mar 2022, 18:14
College students have plenty of time
Not true. The workload does vary a fair bit between colleges and over the year, but in general, a "full time" college student does study full time in total - it is just that he/she is able to schedule the time more freely than someone working 9-5.

Bjerrk

15 Mar 2022, 21:45

Findecanor wrote:
15 Mar 2022, 20:04
Bjerrk wrote:
15 Mar 2022, 18:14
College students have plenty of time
Not true. The workload does vary a fair bit between colleges and over the year, but in general, a "full time" college student does study full time in total - it is just that he/she is able to schedule the time more freely than someone working 9-5.
I mean, I have a PhD in physics and am a university researcher, so I know how it works ;-) I still like to poke a bit of fun (lovingly) at students who imply that graduating will lead to increased free time :-) I was a student not that long ago.

In any case, my post was not serious - you did see that I suggested writing an entire operating system as an alternative to updating Windows? :-D

CaesarAZealad

15 Mar 2022, 22:38

Findecanor wrote:
15 Mar 2022, 20:04
CaesarAZealad wrote:
14 Mar 2022, 21:53
Idk man, the Xbox 360 controller has kinda cemented itself as the default controller in the minds of most people. I remember when Sony plagiarized it and made a similar asymmetrical controller, it really became apparent to me that that design ain't going anywhere.
Sony? You mean Nintendo with the Switch Pro controller being close in appearance?
By "asymmetric" you mean the layout of D-pad and left analogue stick? The Nintendo Gamecube controller and the original Xbox had those in 2001, and both of those copied the Sega Dreamcast controller.
But I'd agree that most new gamepads have an enclosure shaped pretty much like the Xbox 360 controller's.

A proper arcade joystick still gives you better control than the D-pad on any of these. But, no, it does not do analogue.
Nah the PS5 controller, though nintendo also did copy it. I just kinda push nintendo into the back of my mind because they suck. I mean Sony and Microsoft suck as well, but they make video game consoles, not toys for children and people with the minds of children (Furries, manchildren, etc.)
I've used a dreamcast controller, it's just that the OG Duke controller improved on that by not being completely awful XD
The 360 is when it was perfected in its modern form. I picked up a Series X controller recently and it's seen basically no change whatsoever from the Xbone controller, which was just a sleek and sexy version of the 360 controller.

AnnoyedWalrus

16 Mar 2022, 00:10

CaesarAZealad wrote:
15 Mar 2022, 22:38
Nah the PS5 controller, though nintendo also did copy it.
The PS5 controller isn't asymmetric, right? The "Dualsense" (stupid name) for the PS5 has basically the same layout as the "Dual Analog Controller" (great name) from 1997.
I just kinda push nintendo into the back of my mind because they suck. I mean Sony and Microsoft suck as well, but they make video game consoles, not toys for children and people with the minds of children (Furries, manchildren, etc.)
Aren't they all just toy companies making toys for children? There is nothing wrong with toys, but let's not pretend that some toys are better than other toys. (The exception is of course the war between the Genesis/Megadrive and the SNES, where the SNES was clearly superior.)
I've used a dreamcast controller, it's just that the OG Duke controller improved on that by not being completely awful XD
I am sure it is a matter of taste but I really preferred the Gamecube controller since it was a controller designed for human hands and not a controller designed from a PCB.

CaesarAZealad

16 Mar 2022, 00:24

AnnoyedWalrus wrote:
16 Mar 2022, 00:10
CaesarAZealad wrote:
15 Mar 2022, 22:38
Nah the PS5 controller, though nintendo also did copy it.
The PS5 controller isn't asymmetric, right? The "Dualsense" (stupid name) for the PS5 has basically the same layout as the "Dual Analog Controller" (great name) from 1997.
I must be having a Mandela effect moment because I remember it being asymmetrical, I even remember my brother complaining because he liked the PS style of symmetrical sticks. Oh well, my B.
Aren't they all just toy companies making toys for children?
Not really. Xbox and PS are a lot broader with their audiences (though Xbox seems to cater to the more casual audience while Sony seems to cater to financial masochists :P )
Nintendo gave up making actual game consoles after the Gamecube.
I am sure it is a matter of taste but I really preferred the Gamecube controller since it was a controller designed for human hands and not a controller designed from a PCB.
You're right about it being a matter of taste, but personally I don't like any controller nintendo has made, even the gamecube controller. They all feel gimmicky. Plus the quality varies greatly from controller to controller. Though compared to the joycons or switch pro controller I'd take the GC in a heartbeat since it has some semblance of reliability.

pyrelink

16 Mar 2022, 00:25

CaesarAZealad wrote:
15 Mar 2022, 22:38
I just kinda push nintendo into the back of my mind because they suck. I mean Sony and Microsoft suck as well, but they make video game consoles, not toys for children and people with the minds of children (Furries, manchildren, etc.)
What on earth are you talking about?

Muirium

16 Mar 2022, 00:46

Indeed. Nintendo was always the best, and still most likely are. If you need to pump fragmentation grenades into zombies with a modified chainsaw, then get a PC already. You can use real controls on those!

CaesarAZealad

16 Mar 2022, 01:04

Muirium wrote:
16 Mar 2022, 00:46
Indeed. Nintendo was always the best, and still most likely are. If you need to pump fragmentation grenades into zombies with a modified chainsaw, then get a PC already. You can use real controls on those!
Agree to disagree. The only thing I'd ever need a nintendo switch for is animal crossing, and let's just say I "Stapled my 3ds" to my pc ;)

Polecat

16 Mar 2022, 03:25

Muirium wrote:
15 Mar 2022, 17:03
8 gigs is more RAM than my phone, just. More than my watch, too. But less than my real computers. ;)

I've been off Windows so long I've honestly no idea how old Windows 7 is. My last was XP. Are they on 11 or 12 now? Is it a new number every year, like every other OS these days?

My first machine was 32k, I think. It was my old man's but we kids were the only ones to ever use it. Similar story with our first PC. 512k, and ran GEM instead of Windows. So yeah I've a history as well, but still care about the figures, even on the new stuff. Software's job, evidently, is to ensure it still always matters! :lol:
Win7 came after XP. No longer supported, but of course the pirate warnings still keep coming. I (grudgingly) built a Win95 box when I first got online, used that until the browsers no longer worked, at which point I was given a Mac Mini. Used that until I couldn't upgrade it to keep working, so I bought the Win7 box. Honestly I don't have a clue what's inside. I do all my offline stuff on ancient DOS computers.

I feel sorry for anyone who thinks they need a smartphone as part of their support system. I'm not willing to go there; that's one of the few advantages of being old.

Erderm_

16 Mar 2022, 04:55

With monitors though, it's a bit tougher for me to put into words. I think, again, there aren't too many ways to do a monitor like a keyboard. CRT was pretty much the standard for the big boxy monitors of yesteryear and the last major advancement to change the standard was when LCD displays were introduced. At least from what I can gather.
I can confidently say there are loads of people who nerd out the same as we do for keyboards, but for CRTs. Saying all CRTs are the same is about the same as saying all vintage keyboards are just beige wedges. I hope my saying this is the beginning of your addiction to Trinitrons :lol:

Bjerrk

16 Mar 2022, 08:58

I think, again, there aren't too many ways to do a keyboard like a monitor. Full-travel was pretty much the standard for the big chunky keyboards of yesteryear and the last major advancement to change the standard was when flat profile keys were introduced. At least from what I can gather
;)

I guess monitors are much like hi-fi, in a sense. Some people swear that old CGA/EGA/whathaveyou graphics doesn't look right on a modern monitor. Some people insist that Frank Zappa should be played on vinyl :)

Muirium

16 Mar 2022, 09:35

When is someone going to make a wrist mounted vinyl deck for playing music on the go? That’ll be so boss! When you tire of how utterly useless it is, you can add it to your necklace. Sick!

Yasu0

16 Mar 2022, 17:56

I'm in, where do I put down for the pre-order?

CaesarAZealad

16 Mar 2022, 19:52

Muirium wrote:
16 Mar 2022, 09:35
When is someone going to make a wrist mounted vinyl deck for playing music on the go? That’ll be so boss! When you tire of how utterly useless it is, you can add it to your necklace. Sick!
All I can think of is those duel gauntlets from Yu-Gi-Oh

Elrick

31 Mar 2022, 03:57

Muirium wrote:
15 Mar 2022, 17:03
I've been off Windows so long I've honestly no idea how old Windows 7 is. My last was XP. Are they on 11 or 12 now? Is it a new number every year, like every other OS these days?
My second PC was an Apple 7200, which used OS 7. The very best version of an Apple OS that was ever made. Efficient and, if you knew how to configure it, 100% reliable.

That was back when Apple actually made an OS that worked and was freely available for anyone, that wanted to try out their software.

When their OS 10 junk came along, I switched on over towards Microsoft. Still here, many decades later :( .

Findecanor

31 Mar 2022, 10:06

Elrick wrote:
31 Mar 2022, 03:57
My second PC was an Apple 7200 which used OS 7.
It's called "System 7", not "OS 7". Apple didn't start calling it "Mac OS <number>" until Mac OS 8, in preparation for trolling Microware with the next version.

Muirium

31 Mar 2022, 11:41

Findecanor wrote:
31 Mar 2022, 10:06
It's called "System 7", not "OS 7". Apple didn't start calling it "Mac OS <number>" until Mac OS 8, in preparation for trolling Microware with the next version.
Actually, Apple rebranded it to "Mac OS" for version 7.6. System 7.5.x was the end of the road for the awkwardly unnamed operating system.
Users will also notice that references to Macintosh are being changed to Mac OS, and the familiar About this Macintosh item in the Finder now reads About this Computer. Similarly, the much-loved Welcome to Macintosh display that appears when a machine first starts up has been suppressed in favor of a more modern (and more generic) Mac OS logo.
Why is this information even lodged in my head? That's what I want to know! :geek: :lol:

hellothere

11 Apr 2022, 02:26

Muirium wrote:
31 Mar 2022, 11:41
Users will also notice that references to Macintosh are being changed to Mac OS, and the familiar About this Macintosh item in the Finder now reads About this Computer. Similarly, the much-loved Welcome to Macintosh display that appears when a machine first starts up has been suppressed in favor of a more modern (and more generic) Mac OS logo.
https://en.wikipedia.org/wiki/Motorola_StarMax

LambdaCore

11 Apr 2022, 02:55

It's because input devices, in most cases, can't really evolve once they're standardized. The moment the IBM Model M became the standard was the moment the ANSI keyboard was set in stone, so the main changes since then have, ironically, been about keeping costs down to beef up the computer itself, with membranes growing in popularity in the 90s. Controllers went through something similar: after the DualShock and the original Xbox S controller, you've had two base layouts for the most part, with little innovation after the fact.

Ergonomic keyboards, other layouts, etc... these very well could objectively be better, I couldn't tell you for sure, but they face an uphill battle because we're so familiar with the classic QWERTY rectangle, many of us using them since we were mere children.

Muirium

11 Apr 2022, 10:05

The Wii never existed. Noted. :ugeek:

To be fair, it was really hard to get one for a while there when everybody pounced on them.

LambdaCore

11 Apr 2022, 12:55

I actually think the Wii is a great example of this. It was a popular trend for a few years, but ultimately didn't have a huge impact because players across the board preferred standard controllers.

DrunkUkrainian

22 Apr 2022, 15:41

I think most peripherals can be better or worse whereas keyboards are different from each other. Mice, for example, have more accurate ways to sense movement. Keyboards aren't getting better or worse, they're changing. Some people prefer the feel of beam or buckling springs, others like the sound of Alps, others value the customizability and modern features of Cherry MX boards. There's just a lot more room for preference. Do I wish my Northgate Omnikey had a native USB plug and NKRO? Of course, but I'm willing to make that sacrifice in order to experience something that can't be rivaled by newer devices.

At the same time, there are people who still swear by CRT monitors and good headphones stay good, so there are still people in other niches who will look to older stuff instead of newer stuff. The sentiment that "twas better in the older days" is a really common one.

Muirium

22 Apr 2022, 16:03

Cost cutting / race to the bottom is a widespread phenomenon, especially once production goes to China. (Why were “we” so foolish to outsource all our stuff? To a closed country with an ever less benign totalitarian dictatorship?) Jackets, shoes, bags, books, glasses, all manner of consumer goods have absolutely deteriorated in production quality and been redesigned for ever lower costs. It is very real. Modern stuff is mostly shite.

This ubiquitous dynamic in consumer goods is rare, though, when it comes to silicon. Computers are in a league of their own in that respect. Can you just imagine if 1980s CPUs outperformed modern hardware! Arguably software would have to be better coded, as wringing every bit of performance would be just as important again. But I’ll take my M1 over a Motorola 68020 thank you very much. And my OLED phone display. :D

If only the same were true for other things.

davkol

23 Apr 2022, 00:31

hellothere wrote:
15 Mar 2022, 00:51
The "scaling" on my Mac is really "use a different resolution." And all the displays connected to my Mac, even the 165hz one, look pretty terrible at resolutions they weren't built for.
:roll:

That's like the people who watch TV at the wrong aspect ratio, i.e., cropped, or with widened faces (or both)…

It's wrong. :evil:
hellothere wrote:
15 Mar 2022, 00:51
You can scale all objects and fonts in Linux without going to a different resolution.
Not really. The X server used to support any DPI setting… until it stopped in practice. That's separate from screen scaling though, and even that depends on the UI toolkit. At this point, GNOME and to a certain extent Plasma on Wayland work fine at 200% scaling for the most part and fractional scaling is kinda there too… as long as you don't run XWayland, ancient Java applications etc.

As I've already noted somewhere, the only truly reliable approach is Apple's: their officially supported screens and 200% scaling in hardware. They don't account for any other scenario anymore. See: various posts related to SwitchResX—and this, for example: Fundamental Screen Resolution Issues with macOS 11 (Big Sur) on Apple Silicon - Apple needs to act on that

Everything else is an unforeseen mess with some applications simply tiny, blurry or broken.

Muirium

23 Apr 2022, 09:34

I was about to quibble with your point comparing non native 2× Retina to the internationally recognised war crime that is Wrong Aspect Ratio, until I saw the bit about 165 Hz. That display will be 100 DPI for sure, and therefore yes a travesty to look at in any other resolution. Low DPI displays should only ever be run at native 1×. They are poor enough doing that already, don’t push them!

Now, I absolutely do use all my displays (4k 24 inch desktop and M1 MacBook Air) at scaled resolutions. These displays are high DPI, however, and because the pixels aren’t individually visible unless you smack your face into them, there’s no practical reason not to use whatever software resolution suits you best. I find both screens very awkward to use on Apple’s “Default for this display” setting: they’re both cramped with oversized graphics and text I need to shrink to comfortably read. Fortunately, there’s a single button fix for all that. High DPI means you’re untied from the arbitrary resolution of your display. It’s just down to you now and your eyes. ;)

hellothere

23 Apr 2022, 16:54

Muirium wrote:
23 Apr 2022, 09:34
I was about to quibble with your point comparing non native 2× Retina to the internationally recognised war crime that is Wrong Aspect Ratio, until I saw the bit about 165 Hz. That display will be 100 DPI for sure, and therefore yes a travesty to look at in any other resolution. Low DPI displays should only ever be run at native 1×. They are poor enough doing that already, don’t push them!
Did you mean to make the acronym "WAR"? Even if that was unintentional, you got a chuckle from me :D.
davkol wrote:
23 Apr 2022, 00:31
hellothere wrote:
15 Mar 2022, 00:51
The "scaling" on my Mac is really "use a different resolution." And all the displays connected to my Mac, even the 165hz one, look pretty terrible at resolutions they weren't built for.
:roll:

That's like the people who watch TV at the wrong aspect ratio, i.e., cropped, or with widened faces (or both)…

It's wrong. :evil:
Hey. This isn't the Alps lube thread! :P

I've got two computers. On my MX Linux box, all three monitors are "2K" resolution. All of them do 60hz or more. They range in size from 27" to 32". I'm using Plasma at 1.5 scaling and there's no problem. I'm currently typing this on my Windows box, which is currently connected to a 43" 4K 60hz TV that supports HDR. I'm using 300% scaling without a problem.

Are we all talking about the same thing?

One of the above posts led to a page that linked to https://github.com/waydabber/BetterDummy. I might give that a shot.

davkol

23 Apr 2022, 17:14

Muirium wrote:
23 Apr 2022, 09:34
These displays are high DPI, however, and because the pixels aren’t individually visible unless you smack your face into them, there’s no practical reason not to use whatever software resolution suits you best.
Ugh.

That's empirically wrong. Last night I realized there was a 43" UHD TV downstairs that I could test. Of course running it at FHD produced nasty jagged fonts (even from a distance) whereas UHD w/ 200% scaling was nice and smooth.

Why?

It requires a basic understanding of how computer graphics work and what these terms actually mean.

Firstly, Retina display, despite being an advertising buzzword, has a quite precise definition: it's a function of physical pixel density *and* the viewer's distance from the screen, hence the "pixels per degree" unit of measurement. Any display can be "Retina" if you're far enough away.
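That pixels-per-degree definition is easy to sketch. A rough back-of-the-envelope calculation (the ~60 PPD figure is the commonly cited threshold for one pixel per arcminute of vision; the monitor sizes and distances here are just illustrative):

```python
import math

def pixels_per_degree(ppi: float, distance_in: float) -> float:
    """Pixels subtended by one degree of visual angle at a given viewing distance."""
    # One degree of arc spans 2 * d * tan(0.5 deg) inches at distance d.
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

# A 27" 1440p monitor (~109 PPI) viewed from 24 inches:
print(round(pixels_per_degree(109, 24)))  # 46 PPD: individual pixels still resolvable
# The same panel from 36 inches crosses the ~60 PPD "one arcminute" threshold:
print(round(pixels_per_degree(109, 36)))  # 68 PPD
```

Which is exactly why the same TV can look "Retina" from the couch and decidedly not from arm's length.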

With that out of the way: modern software usually doesn't work with pixels directly. Fonts (and some other UI elements) never have on the desktop, hence the need to translate point sizes into pixels. And that's the core issue: different UI toolkits do some of it differently. (Although Apple does have more control over their GUI libraries and tries to enforce their usage by third parties.)

In case of resolution scaling, the graphics stack still works with the full resolution internally and only applies the scaling on output (but when the scaling is fractional, the fractions may not always come out nice). After all, that's the simplest anti-aliasing technique: use more samples. However, if you force a lower screen resolution at hardware level, you get none of this, and leave the final translation to physical pixels to the monitor's firmware (typically coded by monkeys on crack).
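To make the two paths concrete, a minimal sketch of integer-scale HiDPI rendering (the resolutions are illustrative; "looks like" is Apple's wording for scaled display modes):

```python
def hidpi_render_size(logical_w: int, logical_h: int, scale: int = 2) -> tuple:
    """Backing-buffer size: the compositor renders everything at
    logical resolution * scale, then filters that buffer down (or up)
    to the panel's native pixel grid on output."""
    return logical_w * scale, logical_h * scale

# A 2560x1600 panel set to "looks like 1440x900":
print(hidpi_render_size(1440, 900))  # (2880, 1800)
# That 2880x1800 buffer is then filtered down to the panel's native
# 2560x1600 grid -- effectively supersampling. Forcing 1440x900 at the
# hardware level instead skips this entirely and leaves the upscale
# to the monitor's own scaler.
```

The quality gap between those two paths is the whole argument: the compositor's downscale has far more samples to work with than the monitor firmware's upscale ever does.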

(source: I had to sit through a series of seminars on graphics about a decade ago.)
