Jenkins – Vision Paper
By Evan Jenkins
May 12, 2017
My glasses fell off my bed as my alarm rang and vibrated my makeshift night table. I had a real one shipped from IKEA, but I didn’t have a hammer to actually build it. I reached for my glasses and put them on, my eyes watering from the slow start-up screen animation. The electroencephalography, or EEG, Arduino sensors embedded in my glasses began to heat up. I remembered those old bone conduction headphones and smirked; the weirdest things end up making history.
The display came to life and I clicked away the unimportant messages: how much sleep I got, my dream memory video, and my fitness goal for the day. You can use the glasses in EEG-only mode, or in vision-tracking mode combined with EEG. I chose the latter, because it worked much more like the traditional computer I was familiar with and had fewer bugs.
The display now held only the most important information: my job for the day and the time, 9:45 A.M., September 26th, 2022. I looked towards the RFID codes on my wall, which still had to be manually labeled with sticky notes. The shower turned on and the lights in my room moved to the second dimmer setting, giving me time to get my bearings. I checked to make sure I had plugged in my battery the night before, because there are still no batteries that last more than a work day. It was plugged in, so I made my way to the bathroom to shower.
Walking outside, my self-driving car pulled out of the garage. I opened the back door and slid inside, removing my backpack on the way and placing it next to me. The door closed and locked as the car got to 9 MPH. I always wondered why they chose that number, but it makes sense if you don’t think about it too hard, so it stuck.
My location for the day appeared on the screen in front of me: Snæfellsnes National Park in Iceland. The client was a local clothing company, a hip and trendy start-up marketed towards 20-somethings. Hopefully they wouldn’t be there. I had the option to enter 360° mode, transferring from the 2D screen to a virtual experience of the location. I clicked it. I hadn’t had enough time in VR recently. The EEG sent electric signals back into my head to simulate warmth and wind. It was fun, but it used too much battery to do unplugged, so I turned the mode off.
I met my assistants at the airport, the face recognition software getting me quickly through security. The TSA were a shell of their former organization, but it might be better that way. All the lights and light modifiers were accounted for, and I had my camera; the clothes and the model would meet us on the other end.
The flight took an hour and a half, enough time to take a mid-morning nap and mentally prepare for the day. The art director and model were at the airport waiting for us with the clothing options for the day. I had hired the art director myself; the model was probably some friend of the start-up’s owner who couldn’t find a job after university. We took two cars to the park, and my assistants started getting lights and clothes ready.
I liked to do a camera test before every shoot, to make sure I didn’t look dumb in front of anyone. I was still new to this camera and I wanted a quick refresher. The DJI Inspire 5s Mini came to life and unfolded out of its case, the first-person view appearing on my glasses display. The EEG sensors became even warmer as they turned on their fans. I still hadn’t installed liquid cooling. I needed to remember to make a note.
Up. Down. Left. Right. Pan. Follow mode. Stop. Return. Stop. Orbit. Stop. Down. Everything seemed in order. The crew was ready, it was time to put it all in action, and everybody was watching.
It was rather easy, in hindsight. The system showed me the best angle to account for sunlight and my lights. My drone camera flew in the preset pattern before showing me a grid of the images and prompting me for different angles and gestures. An edited version was sent to the model’s HUD. I had made a preset that just increased the contrast and made the colors vibrant yet faded. It was the closest thing I could get to making it look like Instagram. It usually encouraged them to continue without speaking to me; I just didn’t like small talk.
The clothing start-up only asked for three images, and my memory counter showed me that I had 89 to choose from, so I figured I’d do 11 more photos and call it a day. None of the photos were bad, the auto-stabilizer and Auto-Intelligent-focus made sure of that, but some were definitely better than others.
The airplane ride back was the time to sort and choose the photos, and edit if I had time. The glasses display made it easy to figure out which ones were keepers and which could go. The clothing company only asked for three, but if I sent five, then they would think that they had options to choose from, and that they were the ones selecting the photos. I chose six and would delete one later. I applied a preset, imported them into Photoshop, retouched them, and just had enough time to export before stepping off the plane. I would have to select the one to delete before we got to the office; five was okay, but six would make them expect more every time.
I got to the office and uploaded the photos to the proximity-based network. Unfortunately, the proximity system meant I had to be at the office to upload work, a drawback of the faster upload speed. An email popped up on my display. The photo editor had received the photos and was going through them. He still hadn’t figured out that I always send them from the room next door. I was waiting to see how long it would take, considering we don’t allow uploads from outside the proximity net. He would edit them, even though I already had, and send them to the clients.
The client will choose the best photo and it will run on their billboard as well as in their magazine update for that day. Nobody prints magazines or newspapers anymore; everyone just uses their E-MAGs, tablet-like devices that mimic the page-flipping fun of skimming through a magazine, but with AMOLED screens. Users can subscribe to magazines that suit their interests, or have the E-MAG create a personalized zine for their daily dose of news and fashion. Of course, next to each photo is a code that allows users to see what the outfit would look like on themselves, with the option to ship it with next-day delivery, which is standard.
I checked the time: 5:30 p.m., just enough for a small break before I began my second adventure of the day. I took the car home and had a quick early dinner consisting of beefy mac and cheese and asparagus. I plugged my computer into the dongle that attaches to my glasses, and the EEG sensors and transmitters fired into desktop mode. I launched the application, “Silent Glade.” It was the newest MMORPG created by the gaming division of Google, which, in 2020, acquired most of Silicon Valley. My second job was as social media representative and in-game photographer for Google’s ‘Silent Glade’ team. The game was still in beta testing, but I had had free access for three months already, to create promotional materials.
The fantasy game world sprang to life in front of me. The EEG sensors combined with desktop VR mode allowed the game world to appear almost as real as real life, which was quite the feat in modern technology. I checked my log of Google+ posts and cross-referenced it with the list of game regions. I hadn’t been to the Forest of Xak yet, so I set off in that direction after materializing my horse from the menu. It would take about five in-game days to get there, which translated to about five minutes in real life, though it would still feel like five days to players.
I arrived at the site in the Forest of Xak. I had picked out a location on the map that overlooked most of the forest, where a nice sunset would peek over the treetops just past the river that ran through the middle of the zone. Sunset came; I took the picture and prepared the Google+ post.
“The Forest of Xak, just one of the playable zones coming to players in Silent Glade. Will you conquer the forest?”
I posted and auto-shared the photo across my social accounts. I took a few more photos on my way back down the path, pretty pictures to send with my replies to questions or trolls in the comments section.
I replied to comments for 30 minutes and then logged off. It was Monday, and that meant my favorite show was being aired on Google Channel 5 at 8:30. I would have plenty of time to continue my adventures in Silent Glade later that night when my friends got home from work. I unplugged the dongle from my glasses, and turned to Channel 5 just in time for the opening credits.