Are Prompt Engineering Jobs at Risk because of A.I.?

As you know, many AR/MR developers left the field last year to become prompt engineers. I heard some bad news today, though. Apparently A.I. is starting to put prompt engineers out of work.

A.I. is now being used to take normal written commands (sometimes called “natural language,” or simply “language” by non-specialists) and process them into more effective generative A.I. prompts. There’s even a rumor circulating that some of the prompts being used to train this new generation of A.I. come from the prompt engineers themselves. As if every time they use a prompt, it is somehow being recorded.

Can big tech really get away with stealing other people’s work and using it to turn a profit like this?

On the other hand, my friend Miroslav, a crypto manager from Zagreb, says these concerns are overblown. While some entry-level prompt engineering jobs might go away, he told me, A.I. can never replace the more sophisticated prompt engineering roles that aren’t strictly learned by rote from a YouTube channel.

“A.I. simply doesn’t have the emotional intelligence to perform advanced tasks like creating instructional YouTube videos about prompt engineering. Content creation jobs like these will always be safe.”

Learning to Program for the Apple Vision Pro

Learning to program for the Apple Vision Pro can be broken into four parts:

  • Learning Swift
  • Learning SwiftUI
  • Learning RealityKit
  • Learning Reality Composer Pro (RCP)

There are lots of good book and video resources for learning Swift and SwiftUI. Not so much for RealityKit or Reality Composer Pro, unfortunately.

If you want to go all out, you should get a subscription to https://www.oreilly.com/ for $499 per year. This gives you access to the back catalogs of many of the leading technical book publishers.

Swift

To get started on Swift, iOS 17 Programming for Beginners by Ahmad Sahar is pretty good, though the official Apple documentation will get you to the same place: https://developer.apple.com/documentation/swift

If you want to dive deeper into Swift after learning the basics, Swift Programming: The Big Nerd Ranch Guide (3rd edition) is a good read.

But if you want to learn Swift trivia to stump your friends, then I highly recommend Hands-On Design Patterns with Swift by Florent Vilmart and Giordano Scalzo.

SwiftUI

SwiftUI is the library you would use to create menus and 2D interfaces. There are two books that are great for this, both by Wallace Wang. Start with Beginning iPhone Development with SwiftUI and, once you get that down, continue, naturally, with Wallace’s Pro iPhone Development with SwiftUI.
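
If you haven’t run into SwiftUI before, here is a minimal, hypothetical sketch of the kind of 2D interface it is for: a small settings panel with a toggle and a button. The view and property names below are mine, not taken from either book; only the SwiftUI types are real.

    import SwiftUI

    // A small settings panel: a title, a toggle, and a reset button.
    // The names here are illustrative; only the SwiftUI types are real.
    struct SettingsPanel: View {
        @State private var immersiveMode = false

        var body: some View {
            VStack(alignment: .leading, spacing: 12) {
                Text("Settings")
                    .font(.title2)
                Toggle("Immersive mode", isOn: $immersiveMode)
                Button("Reset") {
                    immersiveMode = false
                }
            }
            .padding()
        }
    }

State drives the view: flipping the toggle updates immersiveMode, and SwiftUI re-renders the panel for you.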

RealityKit

Even though Apple first introduced RealityKit back in 2019, very few iPhone devs actually use it. So even though there are quite a few books on SceneKit and ARKit, RealityKit learning resources are few and far between. Udemy has a course by Mohammed Azam, Building Augmented Reality Apps in RealityKit & ARKit, that is fairly well rated. It’s on my shelf, but I have yet to start it.

Reality Composer Pro

Reality Composer Pro is sort of an improved version of Reality Composer and sort of something totally different. It is a tool that sits somewhere between the coder and the artist – you can create models (or “entities”) with it. But if you’re an artist, you are probably more likely to import your models from Maya or another 3D modeling tool. As a software developer, you can then attach components that you have coded to these entities.
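
To make that workflow concrete, here is a rough, hypothetical sketch of the developer side: a custom component defined in Swift and attached to an entity that came out of Reality Composer Pro. The component name and the entity name “Rotor” are made up for illustration; only the RealityKit types and calls are real.

    import RealityKit

    // A custom component defined in code, to be attached to entities at runtime.
    struct SpinComponent: Component {
        var radiansPerSecond: Float = .pi / 2
    }

    @MainActor
    func attachSpin(to sceneRoot: Entity) {
        // Register custom component types once before using them.
        SpinComponent.registerComponent()

        // Find an entity by the name it was given in the RCP scene
        // ("Rotor" is a hypothetical name) and attach the component.
        if let rotor = sceneRoot.findEntity(named: "Rotor") {
            rotor.components.set(SpinComponent(radiansPerSecond: .pi))
        }
    }

A RealityKit System would then read SpinComponent each frame and apply the rotation, but that is beyond this quick sketch.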

There are no books about it and the resources available for the previous Reality Composer aren’t really much help. You’ll need to work through Apple’s WWDC videos and documentation to learn to use RCP:

WWDC 2023 videos:

  • Meet Reality Composer Pro
  • Explore materials in Reality Composer Pro
  • Work with Reality Composer Pro content in Xcode

Apple documentation:

  • Designing RealityKit content with Reality Composer Pro

About PolySpatial

If you are coming at this from the HoloLens or Magic Leap, you are probably more comfortable working in Unity. Unity projects can, under the right circumstances, deploy to the visionOS Simulator; you should just need the visionOS support packages to get this working. PolySpatial is a tool that converts Unity shaders and materials into visionOS’s native shader framework, and I think it is only needed if you are building mixed reality apps, not fully immersive (VR) apps.

In general, though, I think you are always better off going native for performance and features. While it may seem like you can use Unity PolySpatial to do everything in visionOS that you are used to doing on other AR platforms, these tools ultimately sit on top of RealityKit once they are deployed. So if what you are trying to do isn’t supported in RealityKit, I’m not sure how it would actually work.

Four Days From Election Day

The Aviation is one of the earliest recorded cocktails of the 20th century. Just sourcing the ingredients can be a feat in itself.

  • 2 oz gin
  • 1/2 oz lemon juice
  • 1/4 oz maraschino liqueur
  • 1/4 oz creme de violette

Shake with ice and pour into a martini glass. Garnish with a lemon twist or a maraschino cherry. There is some controversy about the creme de violette. Some people leave it out altogether, on the theory that modern versions of the liqueur are inferior to the one used in the original drink. Some throw it in with the rest of the ingredients in the shaker, which creates a very purple drink. I like to add it over a spoon after pouring the glass so it sits in a separate layer, creating a dawn-with-clouds effect, which is how the drink originally got its name.

The disputes over the right way to make an Aviation follow a long-term (or long-form) mode of thought. This is unusual and increasingly rare. Typically we make short-term decisions, and this has been blamed for many of the follies we face today.

Stock market investment is meant to be a long-term matter, but many of the disasters in the market seem to occur when people treat it as a short-form (i.e., gambling) metier. There was a time, as with the characters in a Jane Austen novel, when a person’s worth was calculated based on a 5% return on investments. Mr. Darcy was worth 10,000 pounds a year, which meant he had an endowment of 200,000 pounds. 10,000 pounds a year put Mr. Darcy in the top 1% of British incomes at the time. His 10,000 pounds is equivalent to around £450,000 today, according to a quick unverified Google search I just did. His modern equivalent, I imagine, would consequently be Jared Kushner, not Colin Firth, which makes Ivanka Trump our Elizabeth Bennet!

Okay. Enough of that. Short-term thinking vs. long-term thinking: that is the current topic.

I once had a manager to whom I expressed my concerns that the path we were on in building a software product simply wouldn’t work. There was no audience for it. I was concerned that my manager was hiding information from the CEO of the company and that this would lead to disaster down the road, including but not limited to everyone losing their jobs. My manager calmly told me with a smile that this was something I shouldn’t worry about and that if his strategy didn’t work, it was his job that would be forfeit, not mine.

I think he was sincere in saying this, to the extent any manager is capable of sincerity (I’ve known a few), but the problem was that this was short-term thinking. In the short term, he was confident that what he was saying was true. Later, however, adverse circumstances led to a shortfall in income, and I moved on to other employment while he continued doing internal pitches in order to get more money for his project. He, of course, forgot about any claims he had made previously.

There are several lessons that can be drawn from this. The first is never to trust management. They are not on your side. Their job is to figure out how to get you to further their own goals.

The second is that something said can be true in the short term but not true in the long term. In the short term, people will say whatever gets them through to the end of the meeting they are in. This is what we also call a pragmatic attitude.

Statements concerning actions that remain true over the long term are actually called “ethics.” When someone claims they will do something, and perhaps even believes it, but after a few days or a few weeks abandons that promise, then they are being unethical. If they keep to their word over the long term, they are being ethical. A characteristic of people who keep their word over the long term is that they are thoughtful about what they say and what they promise.

Even the beliefs we hold can be ethical or unethical in this way. I was once serving jury duty in a case that was pretty fun – about which I can’t really say anything. Most of the jury was inclined one way while two were inclined another. At the end of the deliberations, one of the holdouts was eventually ready to change his vote because he had a party he wanted to go to that night. So it was down to the foreperson. She made multiple arguments about how it was possible that the person on trial might not have done what he was accused of. And honestly, her intentions were good. She was concerned about the three-strikes mandate that would have given the defendant an excessive punishment, and many juries at the time were struggling with the notion of jury nullification in cases where they felt the criminal justice system itself was unfair. It occurred to me to ask her, out of curiosity, whether she believed what she was saying.

After which there was a long pause of at least a minute, maybe more. She then announced that she was changing her vote, and we returned to the courtroom to announce our verdict.

“Is that true?” is a powerful question, I discovered that day.

In life, we all say things that sound good at the moment. When I supported pitches to the client while working at a digital agency, this was what we did every day. We said things that sounded good. Strangely, we always believed what we were saying when we said it. Following the George Costanza rule, it isn’t a lie if you believe it.

But what we believe in the moment isn’t the same thing as an ‘ethical belief’. Blame it on social media, the lowering of public discourse, or the long-term effects of Trumpism, but it feels like people no longer believe or speak ethically. There is no sense that the things we say should be true or that the promises we make should be kept. It’s all just words …

And words can destroy lives, markets, norms, social bonds and potentially great nations. My hope for the Biden presidency is that we will finally have ethical beliefs again.

HoloLens 2 Announcement: Quick Recap

Basic sensors look the same between HL1 and HL2, which is nice. Still 4 monochrome cameras for SLAM, 1 TOF for spatial mapping. For me personally, knowing the size of the battery and the refresh rate of the TOF with more consistent power is really huge. Also curious about burn in from the battery bun. Doesn’t it get hot? And the Snapdragon 850 is just an overclocked 845. That’s going to be a bit hotter too, right? Also curious why MEMS? It’s not cheaper or lighter or much smaller than LCOS, so I think it must be there for the FOV improvement (right?), but Karl Guttag complains this will make for blurrier digital images (complain? Who, Karl?).

The piano-playing stage demo was cool, but there was noticeable awkwardness in the way the hand gestures were performed (exaggerated movements), which is actually really familiar from working with Magic Leap’s hand tracking. Either that needs to get better by the summer or there’s going to be some serious disappointment down the road.

Anyone seen any performance comparisons between the Nvidia Tegra X2 (Magic Leap’s SoC) and the Qualcomm Snapdragon 850 (HoloLens 2’s SoC)? I know the 845 (basically the same as the 850) was regarded as better than the X1, but most assumed the X2 would leapfrog it. Haven’t found anything conclusive, though.

Doubling of FOV between HL1 and HL2 might have been miscommunicated and meant something different from what most people thought it meant. It turns out the FOV of the HL2 is actually very close to the FOV of the Magic Leap One, both of which are noticeably bigger than the original HoloLens but still significantly smaller than what everyone says they want.

My friend and VR visionary Jasper Brekelmans calculated the HL2 FOV in degrees to be 43.57 x 29.04 with a diagonal of 52.36. The Magic Leap One is 40 x 30, with a diagonal of 50 (thank you, Magic Leap, for supporting whole numbers).
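
For anyone who wants to check the arithmetic, those diagonals are just the Pythagorean shortcut applied to the horizontal and vertical angles. (That is a simplification, since angular fields of view don’t strictly combine this way, but it is the convention behind these numbers.) A quick Swift sketch:

    // Treat the horizontal and vertical FOV (in degrees) as the legs of a
    // right triangle and report the hypotenuse as the "diagonal" FOV.
    func diagonalFOV(horizontal: Double, vertical: Double) -> Double {
        (horizontal * horizontal + vertical * vertical).squareRoot()
    }

    let hololens2 = diagonalFOV(horizontal: 43.57, vertical: 29.04)  // ≈ 52.36
    let magicLeapOne = diagonalFOV(horizontal: 40, vertical: 30)     // 50.0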

From now on, whenever someone talks about the field of view in AR/VR/MR/XR, we’ll all have to ask if that is being measured in degrees or steradians. Oh wells.

Internet bad boy Robert Scoble probably has one of the most interesting takes on the HoloLens 2 announcement. I hope his rehabilitation continues to go well. On the other hand, it was a really bad week for tech journalists in general.

Unity also made an announcement about HoloLens 2, and even though they have been working on their own in-house core tools for XR development, deep down in the fine print they say that you will need to use the Mixed Reality Toolkit v2 to develop for HoloLens – which is very eeeeenteeresting.

The coolest thing for me, outside of the device itself, was the Azure Spatial Anchors announcement. No one is really paying attention to Azure Spatial Anchors yet, but this is a game changer. It means implementing Vernor Vinge’s belief circles. It means anyone can build their own Pokémon Go app. And it works on ARKit, ARCore and HoloLens, so the future of XR is cross-platform.

Mike Taulty, about whom I can’t say enough great things, has, as usual, dived in first and written up a tour of the new service.

Crap, my boss just came back from lunch. Gotta work now.

Why you should watch “2047: Virtual Revolution”

In the wake of Apple’s successes over the past decade, agencies and technical schools like the Savannah College of Art and Design have been pumping out web designers, a career ecosystem that supports them, and a particularly disciplined, minimalist, flat aesthetic that can be traced back to Steve Jobs. One of the peculiarities of the rise of 3D games and VR/AR/MR/XR platforms is that these children of the Jobs revolution have little interest in working with depth. The standards of excellence in the web-based design world (much less the print-based design world it grew out of) are too different. To work in 3D feels too much like slumming.

But design is still king in 3D as well as on the web. When the graduates of SCAD and similar schools did not embrace 3D design, film FX designers like Ash Thorp and Greg Borenstein jumped into the space they left vacant. For a period after the release of Steven Spielberg’s Minority Report in 2002, there was a competition among FX artists to try to outdo the UIs that were created for that film. From about 2010, however, that competitive trend has mellowed out, and the goal of fantasy UX in sci-fi has changed into one of working out the ergonomics of near-future tech in a way that makes it feel natural instead of theatrical. By carefully watching the development of CGI artifacts in the 21st century – as an archeologist might sift through pottery shards from the ancient past – we can see the development of a consensus around what the future is supposed to look like. Think of it as a pragmatic futurism.

The 2016 French film 2047: Virtual Revolution, written and directed by Guy-Roger Duvert and starring Mike Dopud, is a great moment in the development of cinematic language in the way it takes for granted certain sci-fi visual clichés. Even more interesting is that what appears to be a relatively low-budget film is able to pull off the CGI it does, indicating a general drop in price for these effects. What used to be hard and expensive is now within reach and part of the movie-making vernacular.

The story is about a future society that spends all its time playing online RPGs while the corporations that run these games have taken over the remnant of the world left behind. But what I found interesting was the radial interface used by players inside their RPGs.

It bears a passing similarity to the Magic Leap OS navigation menu.

It also gives a nod to popular gaming genres like Gundam battle suits …

Fantasy UX (FUX) from films like Blade Runner and The Matrix …

And World of Warcraft.

Play the movie in the background while you are working if you don’t have time to get into the plot about a dystopic future in which people are willingly enslaved to VR … blah, blah, blah. But look up from time to time in order to see how the FX designers who will one day shape our futures are playing with the grammar of the VR/AR/MR/XR visual language.

The effects for 2047 appear to have been done by an FX and VR company in France called Backlight. The movie is currently free to Amazon Prime members.

For some more innovative FUX work, please take a look at 2013’s Ender’s Game or the Minority Report TV series from 2015.

Mixed Reality As An Art Form

First a homework assignment and then a request. Please watch a few of the video essays on cinema at http://kogonada.com/.

I’d like to put together something like this for AR and VR. The basic idea is to take something small in [AR | Cinema | Sci Fi television] and then develop the topic into a conversation. But I need your help with ideas.

For instance, consider the flick gesture used in sci-fi movies, but also in Hawaii Five-0 and in the Avengers movies, to send a piece of digital media from one screen to another. Where did it come from? How did it develop as a gesture? Has it changed over time? Where is it used in real-world apps? Does it make sense as good UX, or is it pure sci-fi stuff?

Alternatively, what is the relationship between John Carpenter’s They Live (long overdue for a big-budget remake), The Matrix, and spatial computing?

Alternatively, what does Baudrillard’s ‘desert of the real’ tell us about VR?

Alternatively, what sort of human-computer interactions do we need before head-mounted displays can begin to compete with smartphones as portable computers and fulfill their financial promise?

So I need help with ideas to explore, in essay style, along with suggestions for media that can be used to explore them.

Once we have enough, and if there is enough interest, I’ll work with you on producing these as video essays, and we’ll put together an online book about MR, with each of you as an essay author providing narration.

It’s at minimum a six-month project, but something I feel the world needs and we are uniquely positioned to create. I’m a little tired of both book writing and recording technical videos when good design thinking around MR is what we’re most lacking right now.

So, my dear readers, do you have any ideas about the design and philosophy behind MR that you have been keeping to yourselves but need help expressing?

My VRLA Shopping List

I meant to finish this earlier in the week. I spent the past weekend in Los Angeles at the VRLA conference in order to hear Jasper Brekelmans speak about the state of the art in depth sensors and visual effects. One of the great things about VRLA is all the vendor booths you can visit that are directly related to VR and AR technology. Nary a data analytics platform pitch or DevOps consulting services shill in sight.

Walking around with Jasper, we started compiling a list of how we would spend our Bitcoin and Ethereum fortunes once they recover some of their value. What follows is my must-have shopping list if I had so much money I didn’t need anything:

1. Red Frog HoloLens mod

First off is a modified HoloLens by Red Frog Digital. The fabrication allows the HoloLens to balance much better on a user’s head. It applies no pressure to the bridge of the nose, instead distributing the weight across the user’s head. The nicest thing about it is that it always provides a perfect fit and can be properly aligned with the user’s eyes in about 5 seconds. They designed this for their Zombie Maze location-based experience and are targeting it at large, permanent exhibits and rides.

2. Cleanbox … the future of wholesome fun

If you’ve ever spent a lot of time doing AR and VR demos at an event, you know there are three practical problems you have to work around:

  • seating devices properly on users’ heads
  • cleaning devices between use
  • recharging devices

Cleanbox Technology provides a solution for venue-based AR/VR device cleaning. Place your head-mounted display in the box, close the lid, and it instantly gets blasted with UV rays and air. I’d personally be happy just to have nice wooden boxes for all of my gear – I have a tendency to leave them lying on the floor or scattered across my computer desk – even without the UV lights.

3. VR Hamster Ball

The guy demoing this never seemed to let anyone try it, so I’m not sure if he was actually playing a hamster sim or not. I just know I want one as a 360 running-in-place controller … and as a private nap space, obviously.

4. Haptic Vest and Gauntlets

bHaptics was demoing their TactSuit, which provides haptic feedback along the chest, back, arms and face. I’m going to need it to go with my giant hamster ball. They are currently selling dev units.

5. Birdly

A tilt table with an attached fan and a user control in the form of flapping wings is what you need for a really immersive VR experience. Fortunately, this is exactly what Birdly provides.

6. 5K Head-mounted Display

I got to try out the Vive Pro, which has an astounding 2K resolution. But I would rather put my unearned money down for a VRHero 5K VR headset with 170 degree FOV. They seem to be targeting industrial use cases rather than games, though, since their demo was of a truck simulation (you stood in the road as trucks zoomed by).

7. A globe display

Do I need a giant spherical display? No, I do not need it. But it would look really cool in my office as a conversation piece. It could also make a really great companion app for a VR or AR experience.

8. 360 Camera Rig with Red Epic Cameras

Five 6K Red Dragon Epic cameras in a 360 video rig may seem like overkill, but with a starting price of around $250K – before tripod, lenses, and a computer powerful enough to process your videos – this could make the killer raffle item at any hi-tech conference.

9. XSens Mocap Suit

According to Jasper, the XSens full-body lycra motion capture suit with real-time kinematics is one of the best available. I think I was quoted a price of something like $7K(?) to $13K(?). Combined with my hamster ball, it would make me unstoppable in competitive PvP Minecraft.

10. AntVR AR Head-mounted display

AntVR will be launching a Kickstarter campaign for their $500 augmented reality HMD in the next few weeks. I’d been reading about it for a while and was very excited to get a chance to try it out. It uses a Pepper’s ghost strategy for displaying AR, has decent tracking, works with Steam, and is very good for its price point.

11. Qualcomm Snapdragon 845

The new Qualcomm Snapdragon 845 chip has built-in SLAM – meaning 6DOF inside-out tracking is now a trivial chip-based solution – unlike just two years ago, when nobody outside of robotics had even heard of SLAM algorithms. This is a really big deal.

Lenovo is using this chip in its new (untethered) Mirage Solo VR device – which looks surprisingly like the Windows Occluded MR headset they built with Microsoft tracking tech. At the keynote, the Lenovo rep stumbled and said that they will support “at least” 6 degrees of freedom, which has now become an inside joke among VR and AR developers. It’s also spoiled me, because I am no longer satisfied with only 6DOF. I need 7DOF at least but what I really want is to take my DOF up to 11.

12. Kinect 4

This wasn’t actually at VRLA, and I’m not ultimately sure what it is (maybe a competitor for the Google computer vision kit?), but Kinect for Azure was announced at the /build conference in Seattle and should be coming out sometime in 2019. As a former Kinect MVP and a Kinect book author, I find this announcement mellows me out like a glass of Hennessy in a suddenly quiet dance club.

While I’m waiting for Bitcoin to rebound, I’ll just leave this list up on Amazon, like, in case anyone wants to fulfill it for me or something. On the off chance that actually comes through, I can guarantee you a really awesome unboxing video.

Older but not wiser

In late December I tried making some infrastructure changes to my blog, which is hosted on Microsoft Azure, and managed to hose the whole thing. Because I’m a devotee of doing things the long way, I spent the next two months learning about Docker containers and command line tools only to discover that Docker wasn’t my problem at all. There was something wrong with the way I’d configured my Linux VM and something to do with a button I’d pressed without looking at the warnings as closely as they warranted.

Long story short, I finally just blew away that VM and slowly reconstructed my blog from post fragments and backups I found on various machines around the house.

I still need to go through and reconstruct the WordPress categories. For now, though, I will take a moment to pause and reflect on the folly of my technical ways.

My problem with More Personal Computing as a Branding Attempt

We all know that Microsoft has had a long history with problematic branding. For every “Silverlight” that comes along we get many more confusing monikers like “Microsoft Office Professional Plus 2007.” As the old saw goes, if Microsoft invented the iPod, they would have called it the “Microsoft I-pod Pro 2005 Human Ear Professional Edition.”

While “More Personal Computing” breaks the trend of long academic nomenclature, it is still a bit wordy. It’s also a pun. For anyone who hasn’t figured out the joke, MPC can mean either [More] Personal Computing — for people who still haven’t gotten enough personal computing, apparently — or [More Personal] Computing — for those who like their technology to be intimate and a wee bit creepy.

But the best gloss on MPC, IMO, comes from this 1993 episode of The Simpsons. Enjoy:

Mixed Reality Essentials: A Concise Course

On Saturday, October 29th, Dennis Vroegop and I will be running a Mixed Reality Workshop as part of the DEVintersection conference in Las Vegas. Dennis is both a promoter of and a trainer in mixed reality; he has made frequent appearances on European TV talking about this emerging technology and has consulted on and led several high-profile mixed reality projects. I’ve worked as a developer on several commercial mixed reality experiences while also studying and writing about the various implications and scenarios for using mixed reality in entertainment and productivity apps.

Our workshop will cover the fundamentals of building for mixed reality during the first half of the day. For the rest of the day, we will work with you to build a mixed reality application of your choice, so come with ideas of what you’d like to make. And if you aren’t sure what you want to create in mixed reality, we’ll help you with that, too.

Here’s an outline of what we plan to cover in the workshop:

  1. Hardware: an overview of the leading mixed reality devices and how they work.
  2. Tools: an introduction to the toolchain used for mixed reality development emphasizing Unity and Visual Studio.
  3. Hello Unity: hands-on development of an MR app using gestures and voice commands.
  4. SDK: we’ll go over the libraries used in MR development, what they provide and how to use them.
  5. Raycasting: covering some things you never have to worry about in 2D programming.
  6. Spatial Mapping and Spatial Understanding: how MR devices recognize the world around them.
  7. World Anchors: fixing virtual objects in the real world.

Break for lunch

  8. Dennis and I will help you realize your mixed reality project. At the end of the workshop, we’ll do a show and tell to share what you’ve built and go over next steps if you want to publish your work.

We are extremely excited to be doing this workshop at DEVintersection. Mixed reality is forecast to be a multi-billion-dollar industry by 2020. This is your opportunity to get in on the ground floor with some real hands-on experience.

(Be sure to use the promo code ASHLEY for a discount on your registration.)