If you were designing new -- how about a 120% or a 150% keyboard?

Morituri

17 Jun 2016, 19:19

It is true that the USB standard does not allow keyboards to do this by the usual means.

But every major operating system's keyboard driver now supports the same means of entering extended Unicode characters (or at least can have it enabled), and the keyboard I propose can leverage that. Once the controller has decided which character you intend to send, it does the normal thing if that character is part of the normal keyboard mapping. Otherwise it sends a sequence of reports: no keys pressed; right Alt; right Alt plus a keypad digit; right Alt plus the next keypad digit, and so on until the decimal character code has been entered; then no keys pressed again. Every current OS driver will treat that sequence the same way and inject the Unicode character into the input stream.

It uses more USB packets per character transmitted of course, but USB is fast and nobody will ever notice.

It won't work in the (hardware-supported) boot-protocol mode used for bootup and BIOS config, of course; but that's not where users spend their time.
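To make that concrete, here's roughly what the controller would queue up. This is a sketch only: send_report() stands in for whatever your firmware actually uses to put an 8-byte keyboard report (modifier byte, reserved byte, up to six keycodes) on the interrupt endpoint, and the keycodes are the standard HID keypad usages.

Code:
/* Sketch: type one character as right Alt + decimal keypad digits.
   send_report() is a hypothetical helper, not part of any particular
   firmware; it queues a single keyboard report with the given modifier
   byte and (at most) one pressed key. */

#include <stdint.h>

#define MOD_RALT  0x40   /* right Alt bit in the HID modifier byte */
#define KP_1      0x59   /* HID usage: Keypad 1 */
#define KP_0      0x62   /* HID usage: Keypad 0 */

void send_report(uint8_t modifiers, uint8_t keycode);   /* hypothetical */

void send_alt_numpad(uint32_t codepoint)
{
    uint8_t digits[10];
    int n = 0;

    /* split the code point into decimal digits, least significant first */
    do {
        digits[n++] = (uint8_t)(codepoint % 10);
        codepoint /= 10;
    } while (codepoint);

    send_report(0, 0);                  /* no keys pressed */
    send_report(MOD_RALT, 0);           /* right Alt goes down */
    while (n--) {                       /* digits, most significant first */
        uint8_t d = digits[n];
        uint8_t key = (d == 0) ? KP_0 : (uint8_t)(KP_1 + d - 1);
        send_report(MOD_RALT, key);     /* Alt held, keypad digit pressed */
        send_report(MOD_RALT, 0);       /* digit released, Alt still held */
    }
    send_report(0, 0);                  /* everything released; the host
                                           driver injects the character */
}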

DMA

17 Jun 2016, 19:48

Morituri wrote: […] Every current OS driver will treat that sequence the same way and inject the Unicode character into the input stream. […]
Ah. So it's plain old macros. I'm a bit disappointed, frankly.
Are you sure it's the same way with every OS though?

As for the boot keyboard - just leave the normal scancodes alone. Also, the system kind of lets you know when it's in boot mode (at least it should; I haven't read that part thoroughly), so boot-time support should be doable.
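Something like this, say - a sketch only, with handle_set_protocol() standing in for wherever your USB stack ends up dispatching the HID class SET_PROTOCOL control request:

Code:
/* Sketch: remember which protocol the host selected, so the Unicode
   trickery can be switched off while the host is in boot mode.
   Per the HID spec, SET_PROTOCOL is bRequest 0x0B with wValue
   0 = boot protocol, 1 = report protocol; devices default to the
   report protocol. */

#include <stdbool.h>
#include <stdint.h>

static volatile bool using_report_protocol = true;

/* call this from the control-request handler for the keyboard interface */
void handle_set_protocol(uint16_t wValue)
{
    using_report_protocol = (wValue != 0);
}

/* only synthesize Alt+numpad sequences in report protocol;
   in boot protocol, pass plain scancodes through untouched */
bool unicode_macros_enabled(void)
{
    return using_report_protocol;
}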

Ratfink

17 Jun 2016, 19:58

DMA wrote:
Ah. So it's plain old macros. I'm a bit disappointed, frankly.
Are you sure it's the same way with every OS though?
It's definitely not the same in Linux. Here you enter Unicode characters by pressing Ctrl+Shift+U, typing the hexadecimal code point of the character you want, then pressing Enter.

DMA

17 Jun 2016, 20:02

Ratfink wrote: It's definitely not the same in Linux. Here you enter Unicode characters by pressing Ctrl+Shift+U, typing the hexadecimal code point of the character you want, then pressing Enter.
Dec-to-hex conversion is trivial, and a "Linux mode" can be set via some very special key combination (like pressing 20 keys at once :) ).
Adding other systems isn't hard either - you already have the code points you want to enter, so changing the output translation is trivial.

But yeah, I expected something more elegant for an "almost exploit".
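For instance, the Linux-style output could be something like this (illustrative only - send_key() is a stand-in for whatever your firmware uses to tap one key with the given modifiers, and the keycodes are standard HID usages):

Code:
/* Sketch: emit a code point as Ctrl+Shift+U, hex digits, Enter
   (the convention Ratfink described). send_key() is hypothetical. */

#include <stdint.h>

#define MOD_LCTRL   0x01
#define MOD_LSHIFT  0x02
#define KEY_A       0x04   /* 'a'..'f' for hex digits 10..15 */
#define KEY_U       0x18
#define KEY_1       0x1E   /* '1'..'9' are 0x1E..0x26 */
#define KEY_0       0x27
#define KEY_ENTER   0x28

void send_key(uint8_t modifiers, uint8_t keycode);   /* hypothetical */

static uint8_t hex_digit_key(uint8_t d)
{
    if (d >= 10)
        return (uint8_t)(KEY_A + d - 10);
    return (d == 0) ? KEY_0 : (uint8_t)(KEY_1 + d - 1);
}

void send_ctrl_shift_u(uint32_t codepoint)
{
    uint8_t digits[8];
    int n = 0;

    do {                                   /* hex digits, LSB first */
        digits[n++] = (uint8_t)(codepoint & 0xF);
        codepoint >>= 4;
    } while (codepoint);

    send_key(MOD_LCTRL | MOD_LSHIFT, KEY_U);    /* Ctrl+Shift+U */
    while (n--)
        send_key(0, hex_digit_key(digits[n]));  /* digits, MSB first */
    send_key(0, KEY_ENTER);                     /* commit */
}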

Morituri

17 Jun 2016, 22:02

Uh, no, not "plain old macros." Putting real logic in the controller to do arbitrary character composition makes it generative - not individual macros tied to specific key combinations but something that can be built/determined under program control before it's sent.

It's not some obscure bug or method in the USB spec, no, but if you'd prefer "hack" rather than "exploit" I'll just smile and do it exactly the same way.

From the computer's end it'd look like plain old macros. But that's kind of the point of using on-keyboard programming rather than on-computer programming.

And, yeah, the conversion to hex is trivial for those who have their Linux input set that way, but on my Debian the alt-plus-decimal input method is a configurable option that can be enabled.

DMA

17 Jun 2016, 22:47

Morituri wrote: Uh, no, not "plain old macros." Putting real logic in the controller to do arbitrary character composition makes it generative - not individual macros tied to specific key combinations but something that can be built/determined under program control before it's sent.
Adding an FSM to macros doesn't turn them into something else.
Morituri wrote: From the computer's end it'd look like plain old macros. But that's kind of the point of using on-keyboard programming rather than on-computer programming.
From the host it would just look like you're typing very fast :)

But I'm more and more convinced that you need to build a Raspberry Pi-based (or other "credit-card-sized computer"-based) keyboard controller - with a USB interface, though; no need for Ethernet. And with an HDMI connector, so that programming the keyboard doesn't require superhuman effort. You need a display for that - and keeping the software on the controller insulates you somewhat from the vagaries of OS support.

Morituri

17 Jun 2016, 23:32

Definition of macros: fixed, static, canned. That's not what's happening here. But I don't really care what you call it, so go ahead.

Chyros

18 Jun 2016, 01:24

I still don't understand what any of you are talking about :p . Here's what my desk at work looks like. As you can see I have very little space, but it can still fit my ZKB-2 comfortably with plenty of space to move the mouse around in.

Image

kbdfr
The Tiproman

18 Jun 2016, 07:12

Morituri wrote: From the computer's end it'd look like plain old macros. But that's kind of the point of using on-keyboard programming rather than on-computer programming.
You say "programming" but you mean "retrieval".
The point is not whether the "programming" itself happens "on the keyboard" (in fact it always does) or "on the computer",
but whether the configurations, once programmed, are stored "in the keyboard" or only remain "in a program",
i.e. whether, at retrieval time, a keypress sends the programmed macro itself or a request to a program to execute the macro.

If the keyboard sends the programmed macro stored in its own memory (like Tipros do),
then it does not need any software and will work with any OS running on any computer.
In fact the computer will not even identify it as anything other than a standard keyboard.

HuBandiT

18 Jun 2016, 16:15

Just a few drive-by ideas:

Will you want/need to make it Turing-complete?

To make the description of the intended keyboard behaviour (henceforth "program") available to the keyboard ("programming"), I would transfer it as a file or files into the keyboard (through a removable SD/microSD card that is only removed for programming, or by having the keyboard controller itself emulate a USB storage device, be it https://en.wikipedia.org/wiki/USB_mass_ ... vice_class or https://en.wikipedia.org/wiki/Picture_Transfer_Protocol).

Then carefully establish (develop and document) a well-defined, text-based format for the program. Rely on best industry practices (e.g. extensibility, versioning).

Then let people create tools that make it possible to compose programs for the keyboard. If I had to choose, I would go with client-side JavaScript in a browser as a widely available (and hopefully enduring) platform to develop these tools on. Transferring the program back and forth between the program composer and the files destined for the keyboard could be done via the OS clipboard.

Comb through DT for all features of all controllers/converters, with a mind open for keyboards of all sizes, purposes and technologies, and see which of those features would make sense to include. Cater for keyboard technology-specific extensions as well (e.g. scanning parameters, capacitance calibration tables/parameters for capsense keyboards, debouncing parameters for mechanicals, triggering point settings for analogs (e.g. optical, capacitive), etc.). Maybe a sensible superset could be found that could then enable people to develop program composers able to work with all (or at least many) keyboard controllers, to break down the current silos and cross-pollinate ideas between groups, so that everyone can have the best of all worlds.

Make the file format modular (e.g. multiple files). This will enable people to compose behaviour from smaller, modular pieces and share those pieces, as well as enable keyboard controllers to save the results of continuous HW measurements (e.g. per-key bounce-time statistics, measured per-key capacitance values, detailed key counting and timing measurements for key layout optimization, etc.) back into separate files, allowing development of auto-calibration, keyboard layout optimization, etc. Have separate files for separate concerns (e.g. a low-level HW layer concerned with scanning and debouncing, through a medium-level layer, e.g. how to post computed Unicode to particular OSes, up to higher abstraction layers, e.g. what sequences of keys should do what).
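Purely to make the layering concrete, here is a hypothetical sketch of what such a set of modular files could look like (every file name, key and value here is invented; no such format exists yet):

Code:
# hw-scan.conf -- low-level layer: scanning and debouncing
format-version = 1
matrix = 8x20
debounce-ms = 5

# output-unicode.conf -- medium layer: how code points reach this OS
os = linux
method = ctrl-shift-u-hex

# layout-main.conf -- high layer: what the keys mean
key F13 = U+2203            # ∃
key F14 = compose N _ _ ~   # composed on the fly, as discussed above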

HuBandiT

18 Jun 2016, 16:56

Also, consider the complexity (problem size) of full Unicode composition, and design accordingly. Look into the Unicode standard, https://en.wikipedia.org/wiki/Combining_character and https://en.wikipedia.org/wiki/Unicode_equivalence among others, I guess.

Furthermore, the Unicode standard is a living, evolving standard, so design for easy (preferably automatic) derivability of the relevant keyboard program sections/data tables from the Unicode and related standards.

Furthermore, for full coverage:

You may also want to design for https://en.wikipedia.org/wiki/Japanese_input_methods but this might be difficult to do in-keyboard unless you also add a display.

You may also want to design for chorded input.

One well-established form of chorded input would be in-keyboard machine shorthand (stenography) - e.g. https://en.wikipedia.org/wiki/Keyboard_layout#Plover , http://plover.stenoknight.com/, http://www.openstenoproject.org/ ; ideally this would need analog sensing with on-the-fly user-calibratable trip points, combined with a largish dictionary-based chorded input with on-the-fly dictionary amendments, and would thereby also almost invariably require a display in the keyboard.

Morituri

19 Jun 2016, 00:00

I'm willing to treat Unicode as having the vast majority of its useful bits fixed. The "evolving standard" is mostly filling in corners at this point.

The controller-side programming is needed because it's not just a matter of emitting Unicode code points. They have to be emitted in a way that forms well-formed, normalized grapheme clusters, regardless of the order in which the typist added features to the thing s/he was composing. If the user specifies something that doesn't exist precomposed (some combination of combining accents), the controller has to put the marks into canonical order, compose the base character with whatever subset of accents a precomposed form exists for, and emit the rest as combining marks in the correct order. IOW, the user isn't just typing Unicode code points; s/he's typing normalized NFC Unicode text.

The user might type N, then without releasing it type _ twice and ~. The keyboard emits Ñ followed by a combining double underscore, because Ñ̳ isn't available as a precomposed letter.

Anyway, the normalization logic is the same no matter how users map the keys themselves. The important idea here is generative combination of elements to make Unicode characters and grapheme clusters out of the elements that are actually on the keyboard.
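A grossly reduced sketch of that logic, in C: the real composition and combining-class tables come from the Unicode Character Database (the two entries here are hard-coded just to show the shape), and the NFC "blocking" rules are ignored for brevity.

Code:
#include <stddef.h>
#include <stdint.h>

typedef struct { uint32_t base, mark, composed; } compose_entry;

/* excerpt: 'N' + COMBINING TILDE (U+0303) -> 'Ñ' (U+00D1) */
static const compose_entry compose_table[] = {
    { 0x004E, 0x0303, 0x00D1 },
};

/* excerpt of canonical combining classes:
   tilde above (U+0303) = 230, double low line (U+0333) = 220 */
static uint8_t combining_class(uint32_t cp)
{
    switch (cp) {
    case 0x0303: return 230;
    case 0x0333: return 220;
    default:     return 0;
    }
}

/* Put the marks into canonical order, fold any that have a precomposed
   form into the base, and emit base + leftover marks.  Returns the
   number of code points written to out[]. */
size_t compose_cluster(uint32_t base, uint32_t *marks, size_t nmarks,
                       uint32_t *out)
{
    /* canonical ordering: stable insertion sort by combining class */
    for (size_t i = 1; i < nmarks; i++) {
        uint32_t m = marks[i];
        size_t j = i;
        while (j > 0 && combining_class(marks[j - 1]) > combining_class(m)) {
            marks[j] = marks[j - 1];
            j--;
        }
        marks[j] = m;
    }

    size_t kept = 0;
    for (size_t i = 0; i < nmarks; i++) {
        int composed = 0;
        for (size_t t = 0;
             t < sizeof compose_table / sizeof compose_table[0]; t++) {
            if (compose_table[t].base == base &&
                compose_table[t].mark == marks[i]) {
                base = compose_table[t].composed;   /* fold into the base */
                composed = 1;
                break;
            }
        }
        if (!composed)
            marks[kept++] = marks[i];               /* stays a combining mark */
    }

    size_t n = 0;
    out[n++] = base;
    for (size_t i = 0; i < kept; i++)
        out[n++] = marks[i];
    return n;
}

Feed it 'N' with { U+0333, U+0303 } and you get exactly the Ñ-plus-combining-double-underscore result described above.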

HuBandiT

19 Jun 2016, 03:24

That double-underscored capital Spanish n is beyond me, I'm from ISO-8859-2 land. :)

I hope you're not saying that this composing functionality is all you want to accomplish? Are you not interested in "programmability" in the sense of what other people would call "macros"? Or in providing enough tools to satisfy the needs of reduced keyboard layouts? Or in completely freely redefining what keys/buttons do?

Morituri

19 Jun 2016, 19:03

Oh yeah, I'm SURE people would need a macro pad with something like this. A really good one. For example, if someone is writing a math paper where they're using Greek characters for variables, combining double overbars to denote vectors, and some other weird composed character for a mathematical operator, they'd want to type those laborious-to-enter characters once, put them on a macro pad, and then call them up with a keystroke. Also, there's a bunch of things there's no really obvious way to type, like dingbats. If you actually use them for some purpose, you'd want to put them on a macro layer and use your macro pad to access them.

And of course the gamers still want macro pads for perfectly timed rocket jumps, Immelman turns, and fake autofire bursts from non-automatic weapons or whatever.

But macro pads are well-understood. You make a recording, and you fire it off when you need it. They're not the new feature.

kbdfr
The Tiproman

19 Jun 2016, 19:22

Morituri wrote: […] But macro pads are well-understood. You make a recording, and you fire it off when you need it. They're not the new feature.
At least they exist. That will help bridge the time until the new feature is operational :mrgreen:

Morituri

19 Jun 2016, 19:41

Darn right they do. If it wasn't for macro pads I'd be SO screwed when I write a paper.

Ratfink

19 Jun 2016, 20:03

Morituri wrote: Darn right they do. If it wasn't for macro pads I'd be SO screwed when I write a paper.
What exactly is wrong with \newcommand? ;)

Morituri

19 Jun 2016, 21:00

Ah. A LaTeX user. Nothing wrong with it but I prefer plain text.

eekee

20 Jun 2016, 19:34

DMA wrote:
eekee wrote: On the other hand, I expect I'll have a hard time programming it because USB is a huge pain; far more complicated than it needs to be.
LUFA makes it much less painful, actually. You will still need to write descriptors, but that's not that hard - enough keyboard ones are circulating around.
Good to know!
DMA wrote:
eekee wrote: SD, like USB, like gigabyte-size onboard Flash, is a pain. Basically, there are no sane standards any more. They all require far more work than is necessary, and have sections which are hard to understand or downright vague. Still, it's likely there is reasonable example SD-card interfacing code available for Launchpads and/or similar boards.
Don't forget that you need to drag in filesystem support if you're going the SD/USB-host route! I forgot to write about that. It's doable, and there are probably frameworks that make it less painful - but expect it to eat a lot of flash and a fair chunk of RAM.
From what I gather, even FAT32 support is less work than USB itself. Have to get writing right, of course.
DMA wrote:
eekee wrote: Emulating a USB network device and the computer behind it is OK with some hardware. My Zaurus worked fine in this configuration, but the Raspberry Pi version 1 in particular has nasty problems with this. I think it's an interrupt problem.
That was sarcasm! Though now I think that having the configurator as an app on the keyboard itself, so you can configure it by connecting an external monitor or by using SSH and running the app, would be a nice thing, contributing to the longevity of the device while saving development effort (you don't have to worry about portability; you have your own OS :) )
...until the HDMI connector goes the way of the dodo, that is :)
Oh, I saw your sarcasm; your emphasised "*and*" made it clear. I just thought it could work so well, and I made a mistake when looking at the colorForth source code: it made things look simple. The LAN block is only 60 words long, counting `×` and `;`. Even the northbridge driver fits comfortably in one of colorForth's 46x30 screens. What I didn't see was that the LAN block requires the serial port driver; there's no Ethernet support. (Not in the colorForth image I use, anyway.)

It might actually be a good idea to run a 'real' operating system in the keyboard controller, perhaps Linux on a Raspberry Pi. (I was wrong about the Pi; I discovered today that the port my opinion was based on was made by idiots, copying and pasting code from different kernels without any idea how they fit together.) At least, it probably would be if you were going my route with VNC over USB Ethernet, but it would still not be an entirely smooth ride.

On a lighter note, good idea of yours to send keypresses simulating Unicode input, Morituri. I could even make that work trivially with Plan 9. On the other hand, Plan 9 is so simple it would be easy to replace the keyboard driver with something that reads UTF-8 straight from a USB serial device. :)

Morituri

27 Jun 2016, 00:39

If you want the proper usage/exploit of the USB standard itself, which is what I was originally thinking of, here it is. It would be proper, according to the USB HID specification, to define a (non-boot) report protocol including a field from usage page 0x10 (Unicode code points), enabling the keyboard to directly and correctly report Unicode characters. There is no need to wrestle the committee into changing the HID specification.

The problem with that hack is that although it would be correct and proper for an operating system to support that usage in its keyboard drivers, I doubt any operating system at this time has a single clue what to do with a Unicode character arriving from an input device. Hence the hackery of autogenerating keycodes to simulate Unicode input methods.
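For the record, the report descriptor for such a field might look something like this - untested byte-for-byte, and as I said, no host driver I know of would actually consume it:

Code:
/* Sketch of a non-boot report descriptor carrying one 16-bit Unicode
   code point per report, using usage page 0x10 (Unicode). Byte values
   follow the HID short-item encoding. */

static const uint8_t unicode_report_descriptor[] = {
    0x05, 0x01,                    /* Usage Page (Generic Desktop)     */
    0x09, 0x06,                    /* Usage (Keyboard)                 */
    0xA1, 0x01,                    /* Collection (Application)         */
    0x05, 0x10,                    /*   Usage Page (Unicode)           */
    0x19, 0x00,                    /*   Usage Minimum (U+0000)         */
    0x2A, 0xFF, 0xFF,              /*   Usage Maximum (U+FFFF)         */
    0x15, 0x00,                    /*   Logical Minimum (0)            */
    0x27, 0xFF, 0xFF, 0x00, 0x00,  /*   Logical Maximum (65535)        */
    0x75, 0x10,                    /*   Report Size (16 bits)          */
    0x95, 0x01,                    /*   Report Count (1)               */
    0x81, 0x00,                    /*   Input (Data, Array)            */
    0xC0                           /* End Collection                   */
};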

y11971alex

27 Jun 2016, 05:15

Morituri wrote: Anyway, what is this "eight layers" crap? A gigabyte of nonvolatile RAM costs about a quarter, so why are macro boards limited to so many layers, so many keystrokes per macro? That doesn't make any damn sense.
Image

I suspect some of these chips must be memory chips of some kind. They look too uniform to be processor and communication chips. But the point seems to be that not every keyboard has that kind of internal space to store such a huge PCB that can accommodate a bank of DDRs.

DMA

27 Jun 2016, 18:40

y11971alex wrote: I suspect some of these chips must be memory chips of some kind. They look too uniform to be processor and communication chips. But the point seems to be that not every keyboard has that kind of internal space to store such a huge PCB that can accommodate a bank of DDRs.
Those are buffers. They drive the lines.
One of them is a demux, which controls the buffers.

http://www.kbdbabel.org/schematic/kbdba ... at_kbd.pdf - pictured is the older version, but the principle is the same.

Ratfink

28 Jun 2016, 05:30

y11971alex wrote: […] But the point seems to be that not every keyboard has that kind of internal space to store such a huge PCB that can accommodate a bank of DDRs.
Okay, it would have taken a lot of space in the 1970s when that beamspring keyboard was made. Meanwhile in the present, MicroSD cards are dirt cheap and as small as a fingernail, hardly requiring a large keyboard to fit in multiple gigabytes of storage.

y11971alex

28 Jun 2016, 07:44

Ratfink wrote: […] Meanwhile in the present, MicroSD cards are dirt cheap and as small as a fingernail, hardly requiring a large keyboard to fit in multiple gigabytes of storage.
But it would be this size at least, wouldn't it?

Image

And the existing controller board still needs some room too.

Morituri

28 Jun 2016, 16:59

Actually it would be more this size:

Image

That big ol' square in the middle? That's a slot for an SD card, which is, yes, about the size of your thumbnail, and even some pretty ordinary ones hold 8 GB. It's shown with an adapter for a microSD card, which is only about a quarter of that size, and some pretty ordinary ones of THOSE hold 8 GB. I think SparkFun even sells an 8 GB microSD card with a complete Linux installation on it.

Anyway, that's an SD shield for Arduino, and costs $12 if you buy it on Amazon. It also works on a Teensy.

y11971alex

28 Jun 2016, 19:54

:shock:

So controllers are excepted from older = better... got it! :D

duān-hiàn

06 Jun 2018, 13:21

Hello,
Have you achieved your goal? I am about to start making a big keyboard with about 500 to 600 keys carrying Unicode characters. That number can't be reduced. Right now I have two plans:
  1. Use only the Keyboard usage page (0x07), utilizing the unused Usage ID range from 0xE8 to 0xFFFF for key-code reports, then write three drivers for the three systems (Windows, Linux, Mac).
  2. Use both the Keyboard page (0x07) and the Unicode page (0x10), assuming software can recognize the surrogates 0xD800 to 0xDFFF. The report length would vary for code points in Plane 1 onwards.
After discussing it with other people, I think defining a new usage page that uses UTF-32 might also be feasible; that could be a third way.

I also want to use this big keyboard in the BIOS and under DOS, but none of these approaches may work with the boot protocol.

Findecanor

06 Jun 2018, 16:33

Morituri wrote: Edit: I've hacked up a layout to show what I'm talking about. It's the same size as a couple gaming keyboards I've got, but 60 new keys are here. I've left them unlabeled except for the three above the arrow pad. (to allow mouse operations from the keyboard - though you might still want a real mouse for smooth control. )
If I were to design a keyboard that I wanted to sell, I would not mess with the inverse-T cursor key layout or put anything other than Ctrl in the bottom corners: those are things people often complain about when they are "wrong" for them.
I also think that F-keys are more often found by their position within one of the three four-key groups than by their labels.
Of course, if you are making it just for yourself, then my comment does not matter.
