Closest Thing to a Replicator

The radical judges were put in place to do a job, and they are doing it. Every woman I know is mad as hell, and every man with more understanding than a brick is also. Not to mention that the freshly packed Trump Supreme Court (TSC) immediately went chipping away at the separation of Church and State, and Thomas has proclaimed they’ll take on Marriage Equality (as in attack) and contraception later. Everyone knew this was coming, and I’m only surprised it took this long, and that’s all I’ll say on the matter…for now.

I want to talk about happier things, or at least something else. The Seventh Law of Power is going along fine (new character(s) coming in soon), and I bought myself a late birthday present: a 3D Printer.

I always thought one of the coolest things in later Star Trek was the notion of the Replicator. More magic than technology, as far as I was concerned. A thing in the wall: you ask it for precisely what you want, and if you get the request in the proper form (Earl Grey, Hot), you get it…mostly. Tell me that’s not a Genie granting wishes, though I’m a little surprised the thing didn’t try to create Earl Grey on fire. It was very literal.

Those of you who have experience with 3D printers already know they have a lot in common with the Replicator, if far less capable. You have to know what you want, express it precisely enough that a slicer app can turn the model into g-code instructions the device understands, and the thing makes it for you…if nothing goes wrong. Though calling it a printer is a bit of a misnomer. It’s really a miniature additive manufacturing device, and I have seen versions large enough to make a house.
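For the curious, g-code is just a stream of plain-text motion commands. Here is a minimal sketch in Python (toy numbers and a made-up helper, not the output of any real slicer) of the kind of thing a slicer emits for one square perimeter of a layer:

```python
def square_layer_gcode(size_mm: float, z_mm: float, feed_mm_min: int = 1200) -> list[str]:
    """Emit toy g-code for one square perimeter at height z_mm.

    Illustrative only: a real slicer also computes extrusion amounts
    (E values), temperatures, retractions, infill, and much more.
    """
    corners = [(0, 0), (size_mm, 0), (size_mm, size_mm), (0, size_mm), (0, 0)]
    lines = [f"G1 Z{z_mm:.2f} F{feed_mm_min}"]  # raise nozzle to layer height
    for x, y in corners:
        lines.append(f"G1 X{x:.2f} Y{y:.2f} F{feed_mm_min}")  # linear move
    return lines

for line in square_layer_gcode(20.0, 0.2):
    print(line)
```

Multiply that by hundreds of layers and you get a sense of why the machine does exactly what the file says, and nothing else.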

Anyway, once I had it assembled (which is another story. Supposed to take an hour. Took most of the day), I took my first stab at it just to see if I had put the thing together right, using an existing file for the test. You can see the result above. Not bad for a first attempt, if a little fuzzy around the ears.

Next I’m going to get my feet wet in CAD software so I can make my own designs. Because.

75% Chance of Human

Photo by Kindel Media on Pexels.com

I’m almost done with the introductory AI course, with three classes left. Also almost done with Chapter 10 of The Seventh Law of Power, though the two are not related, other than serving as evidence I can still multitask. Marta has just had to deal with a suspiciously inept assassin and I’ve been deep in the weeds with MediaPipe and TensorFlow.

We did a series on facial recognition early in the course, with pictures of known politicians (posed) for training data and more candid shots for the recognition bit. The software was amazingly accurate. In the last few lessons we’ve moved into facial and gesture detection: spotting what is and isn’t a face on a more fundamental level than simple ID, and recognizing hand gestures.

In a recent lesson, the goal was to make the program recognize a face on camera and draw key landmark points on that face, using our own faces as the targets. While we were looking at the raw data, one of the first parameters to show up was the program’s confidence that “this is a human face,” given as a probability between zero and one. Our instructor was fussing that the routine never gave his own face a probability higher than .93, or 93% confidence.

Mine was never more confident than 75%.
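The confidence gate itself is simple once the model hands you the scores. A minimal sketch in plain Python (hypothetical labels and scores standing in for what a detector like MediaPipe’s actually returns) of how that probability is typically used:

```python
def keep_faces(detections, min_confidence=0.5):
    """Keep only detections whose 'is a human face' score clears the threshold.

    Each detection is a (label, score) pair with score in [0.0, 1.0];
    the labels and numbers here are made up for illustration.
    """
    return [(label, score) for label, score in detections if score >= min_confidence]

# Hypothetical raw results: the instructor's face, mine, and a non-face blob.
raw = [("instructor", 0.93), ("me", 0.75), ("lamp", 0.31)]
print(keep_faces(raw, min_confidence=0.5))  # both faces survive; the lamp does not
```

Raise `min_confidence` past .75 and, apparently, I stop existing.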

When it was time to identify gestures, you can guess which one I chose.

A Very Fast Idiot

Photo by cottonbro on Pexels.com

Back in college my major was Polymer Science. Plastics, resins, that sort of thing. Even so, it was deemed necessary for undergrads in technical fields to gain some exposure to computers and programming.

Bear in mind, this was still early days in the computer revolution. Personal computers did not exist, nope, not so much as a Trash 80, though the Apple I was already on its way. What passed for a small computer was a DEC PDP-11, about the size of a large filing cabinet. Paper tape. Punch cards, which were bloody awful. Later, if you were lucky, a dumb CRT terminal. If you weren’t, a paper teletype machine. Our campus system was a XEROX mainframe (Sigma 9), 64K main memory (that’s K, not G or even M). The system took up an entire (and very large) room. Have I dated myself well enough? I should say so.

I knew nothing then, thinking computers were something almost magical a la Star Trek. Not these computers. You had to tell them everything, and I do mean everything, in precise instructions, in order, and they would do what you told them and nothing else. Problem was, what you thought you told them wasn’t always what you actually told them, and since we were the last generation running batch jobs it sometimes took a long wait before you knew you’d messed up. They were, as the “elves” in charge of the Sigma referred to them, “very fast idiots.”

I loved it. One of the biggest regrets of my misspent youth was that I didn’t change my major in my first year. Regardless, cut to the present. Narrow AI is progressing by leaps and bounds and used everywhere (not always a good thing); general AI is either imminent or impossible, depending on who you ask. I’m taking online classes in machine learning because I can and I want to.

My last lesson was on gesture recognition. You film yourself on a webcam, and using prefab learning-model libraries, teach the computer to recognize a human hand. For practice we created an updated version of Pong, only this time you point at the top of the screen and the computer has to know to put the paddle where you’re pointing to intercept the ball. So far so good, only mine was creating a paddle when my hand wasn’t even in the shot. Took me a moment to realize why: it was interpreting the headstock of my Peavey Predator hanging on the wall as a hand.
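The paddle logic itself is one line of arithmetic. A minimal sketch in plain Python (hypothetical function name and numbers; the real fingertip coordinate comes from a hand-tracking library such as MediaPipe, which reports positions normalized to 0.0–1.0) of mapping a pointing finger to a paddle:

```python
def paddle_x(fingertip_x_norm: float, screen_width: int, paddle_width: int) -> int:
    """Map a normalized fingertip x (0.0..1.0, as hand trackers report it)
    to the left edge of the paddle, clamped so it stays on screen."""
    center = fingertip_x_norm * screen_width
    left = int(center - paddle_width / 2)
    return max(0, min(left, screen_width - paddle_width))

# Fingertip dead center on a 640-pixel-wide screen with an 80-pixel paddle:
print(paddle_x(0.5, 640, 80))  # 280
```

The usual first defense against false positives like my guitar’s headstock is simply raising the hand detector’s minimum-confidence threshold, so low-scoring “hands” never reach this mapping at all.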

Still doing exactly what it’s told, if not exactly what you intended. Still a very fast idiot. I’m not holding my breath on that general AI thing.