Vision Paper – Rickert
May 10, 2016
PAF: Memory of the Future
Memory is a strange aspect of human life. I think we can all agree on that, yes? There is short-term and long-term memory, augmented memory and the misinformation effect, Alzheimer’s disease and dementia. For something so fragile and suggestible, there is a lot riding on memory. Truth and reason are void without memory to solidify their accuracy. Emotions have no home base without memory. The body is no longer functional when memory disappears.
Would you believe me if I told you that 300 years from today, the concept of a good memory will no longer be necessary? It is true. Let me explain.
2316 is the year the Personal Assistant Flyer (PAF) is released and mandated by the government for the general public. PAFs are small, noiseless modules that follow their masters at all times, day or night. PAFs record data of all sorts, including video, sound, smell, weather, and their masters’ emotions and interactions, from an adjustable flying distance of three inches to ten feet. GPS units and sensors keep PAFs flying beside their masters at all times, and an alert will sound and a message will be sent to the government when the distance exceeds 100 feet, because that means something might have gone wrong. PAFs are almost an extension of the brain and nervous system. PAFs and their masters are so intertwined that many masters name and decorate their PAFs to resemble their own looks.
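The follow-and-alert rule described above could be sketched as a short program. This is purely an illustration of the fiction; the function names, positions, and everything except the three-inch, ten-foot, and 100-foot figures from the story are invented:

```python
import math

# Thresholds from the story: follow range of three inches to ten feet,
# and a government alert when separation exceeds 100 feet.
FOLLOW_MIN_FT = 0.25
FOLLOW_MAX_FT = 10.0
ALERT_FT = 100.0

def check_separation(paf_pos, master_pos):
    """Classify the PAF's distance from its master.

    Positions are (x, y, z) coordinates in feet.
    Returns 'ok', 'adjust' (steer back into follow range), or
    'alert' (in the story: sound an alarm and message the government).
    """
    d = math.dist(paf_pos, master_pos)
    if d > ALERT_FT:
        return "alert"
    if d < FOLLOW_MIN_FT or d > FOLLOW_MAX_FT:
        return "adjust"
    return "ok"

print(check_separation((0, 0, 6), (0, 0, 0)))    # hovering 6 ft overhead -> ok
print(check_separation((150, 0, 6), (0, 0, 0)))  # 150 ft away -> alert
```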
PAFs are required for every human from the age of five onward. When a child is born, parents can buy the child a new PAF to start from scratch, or they can reprogram an existing PAF for the child by age five (I will explain this further in the report).
May 15th is record day. Every year, all functioning aspects of society stop so everyone can upload, organize, and store their memories, thoughts, and personalities to their PAFs. Normal functioning memories (like those we have in 2016) continue to be stored in the brain, but accurately recorded, fact- and data-based memories from birth to death are stored in the PAF, accessible in seconds purely by thought. Data is stored in the PAF forever, until intentionally reprogrammed.
This technology has been very positive for the general public. It helps combat diseases that affect memory, prevents memory loss from old age, and even provides an option of immortality. People no longer have to be tied to their phones and cameras to capture every moment of their life, because the PAF already does that for them.
The government has also found PAFs to be extremely valuable. They ensure accuracy in history and ‘truth’ because every aspect of life is recorded and filed in safekeeping. Crime prevention is at an all-time high because criminals have no way to escape the hard evidence of their crimes; it is all recorded.
However, PAFs have also created a social structure controlled by money and the power of memory. You see, every year on record day, people must connect and log their IQ, beauty, and health data markers to the world government systems. Those with higher IQs are automatically given better, higher-paying jobs. Those with perfect health marks are given free health insurance, and the price increases as health declines. And it is just human nature to want not only to be comfortable in one’s own skin, but to be as beautiful as possible.
This recording and ranking system has led to a black market of bodies. You see, because all functions of the brain, and in some sense the soul, are stored in the PAF, the physical body is just an external representation of the human; all personal qualities of that human being live in the PAF. It is possible to essentially switch bodies by connecting the PAF to a different body. Some masters do this because their body is unhealthy or physically unfit and they want a healthier one, some do not like the physical appearance of their body and want to switch to one they deem more beautiful, and some use this method to more or less never die. When the current body is reaching its age limit, the master can switch their PAF to someone willing to be bought out, or to a five-year-old body looking for a PAF, and start their physical life over with their old memories, personality traits, and other intangible features intact.

Sometimes high-quality (healthy and beautiful) bodies are sold for millions of dollars. The master being bought out then gives that money to their family and lets their PAF be reprogrammed, essentially killing them. This practice has led to an unethical pattern of the 1% living forever by continuing to buy younger bodies until those bodies no longer suit their needs, then letting them die off to move on to yet younger bodies. On the other side of the equation, people living in poverty bid against each other, driving prices lower and lower, until they are selling their PAFs off for fractions of what they were originally sold for. These people are sacrificing their lives to provide pennies of additional wealth for their families.
Additionally, PAFs have created controversy between the world governments and their citizens about privacy: what information is private, versus what can be seen by the government, used for data purposes, or sold to businesses for profit. Having something record your every move is quite unnerving, and at first most people were terrified to live their lives freely and normally. It wasn’t until the government developed strict privacy laws that masters began to experiment with their PAFs, really use them to their advantage, and bond with them as extensions of their own minds. This is not to say there are no instances of distrust. For instance, the police have the right to search someone’s entire PAF if they have reason to believe that person committed a crime. This seems like a sensible rule, except that it comes down to the discretion of police officers to choose whom to search and whom not to. You would think that with so much advancement in technology our social morals would have progressed as well, but sadly, PAFs have become the new tool for racial profiling.
As with all technology, there are pros and cons to these advancements. Many developments provide better, easier lives, or even create the ability to do something only ever dreamt of before. But there are also consequences of new technology, often including privacy issues and questions of who owns the rights to the data it collects. The invention of Personal Assistant Flyers in 2316 was no different. You’ll see, someday. Until then, enjoy your freedom, enjoy your limited memory, and look forward to the possibility of living forever through your mind and memory.
Drones + Me =? (Assignment 4)
May 7, 2016
As a Geography and Television dual major, I feel the options for drones in conjunction with my career are endless. Right now I’m interested in either working in the urban planning field, the television industry, or somehow combining both career paths for an awesome hybrid job.
In geography, I can see drones being a great asset in creating accurate maps of the world. I’m sure that in the next few years there will be drones with long-lasting batteries that can be used to study climate change, crops, and other physical changes in the world over time.
In the entertainment industry, there are already ideas about the use of drones in the future. I found an article about the use of drones on the hit Netflix series Narcos, in which drones are programmed to track and record an actor’s movements.
“Four actors jump between rooftops and sprint through sheets drying on clotheslines while a camera tracks them from above, hovering close enough to see their faces. The sequence, filmed in Bogotá for the coming Netflix drug-war series, “Narcos,” was too intimate to capture by helicopter and too intricate to choreograph easily from the ground. Cue the drone.”
There is also a video attached to the article that discusses different projects where drones have been used, from music videos and movies to personal low-budget projects with amazing outcomes.
Here is OK Go’s ‘I Won’t Let You Down’, a cool music video where drones were key to the making of the project.
Though there are still safety concerns about these flying cameras operating around the general public, the possibilities for this new technology are endless. If the price range were a little lower, I’d be in line to buy one at this very moment; until then, I’ll enjoy the projects that come from drones and see how other students are using them at school.
Methane Sensors, Cows and Air Pollution (Assignment 5)
May 7, 2016
Sensor journalism is a medium of storytelling that is fairly new to me. In the Tow Center’s sensor journalism report, there are five case studies on how journalists used sensors to measure or sense something in the world for a research-based story.
After reading these case studies and visiting the SparkFun website, with its thousands of different sensors, I knew what would be an interesting variable to measure.
I went to Robert D. Bullard’s guest lecture, sponsored by the Geography department, a few weeks ago, where Dr. Bullard discussed his studies of air pollution (and pollution in general) in relation to geography. His findings were extremely interesting, and I immediately thought of them while reading the Houston Chronicle case study from the Tow Center report.
Bullard did case studies in Houston, Texas; Flint, Michigan; small towns in Tennessee; and many other places. All his findings led to the fact that the neighborhoods surrounding locations with bad air quality, bad water quality, or other forms of pollution were full of minorities. He then went into the politics of neighborhoods and urban spaces, revealing remarkable data about how these communities often go unnoticed.
My idea for a sensor has to do with air pollution in relation to farming. After hearing Bullard’s lecture, my interest was sparked in learning more about air pollution and how we can stop it. One interesting finding was that cattle farming contributes a significant amount of the methane emitted into the air, because cows release large, concentrated amounts of gas when corralled in large groups on cattle farms: “Ruminant livestock can produce 250 to 500 L of methane per day. This level of production results in estimates of the contribution by cattle to global warming that may occur in the next 50 to 100 yr to be a little less than 2%.”
One scholarly article I read about cows and methane emissions noted that several factors influence methane emissions from cows, including “level of feed intake, type of carbohydrate in the diet, feed processing, addition of lipids or ionophores to the diet, and alterations in the ruminal microflora.” If more farmers knew about the effects of their cattle on the atmosphere, and how they could change them, we could really make a difference in air quality in the next few years: “Manipulation of these factors can reduce methane emissions from cattle.”
The sensor I’m interested in using is the methane/CNG gas sensor. My study would involve teaming up with several groups of cattle farmers. I would ask them to first use the sensors to measure their current air quality; then I would have different groups change different factors in their cattle-raising techniques to see how each change affects air quality. Finally, the change that made the biggest difference in air quality would conclude my study. This would be very valuable to farmers who are interested in helping the environment in any way they can. The study would be time-consuming, but I think the end would definitely justify the means.
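As a back-of-the-envelope illustration of the study design, the only real numbers below are the 250–500 L of methane per cow per day quoted from Johnson and Johnson; the herd size and the sensor readings are invented placeholders, not real data:

```python
# Per-cow figure quoted from Johnson & Johnson (1995): 250-500 L of methane/day.
LITERS_PER_COW_LOW = 250
LITERS_PER_COW_HIGH = 500
HERD_SIZE = 200  # hypothetical farm

daily_low = LITERS_PER_COW_LOW * HERD_SIZE     # 50,000 L/day for the herd
daily_high = LITERS_PER_COW_HIGH * HERD_SIZE   # 100,000 L/day for the herd
print(f"Herd emits roughly {daily_low:,}-{daily_high:,} L of methane per day")

def mean(readings):
    return sum(readings) / len(readings)

def percent_change(baseline, treatment):
    """Relative change in the average sensor reading after a husbandry change."""
    return 100 * (mean(treatment) - mean(baseline)) / mean(baseline)

# Invented sensor readings (arbitrary units) before and after one change,
# e.g. a diet adjustment; each group of farms would get its own comparison.
baseline = [412, 398, 430, 405]
after_diet_change = [361, 355, 372, 348]
print(f"Change in average reading: {percent_change(baseline, after_diet_change):+.1f}%")
```

The study would then rank the tested changes by this percentage and recommend the one with the largest drop.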
Scholarly Article citation:
Johnson, K. A., and D. E. Johnson. “Methane Emissions from Cattle.” Journal of Animal Science 73.8 (1995): n. pag. Alliance of Crop, Soil, and Environmental Science Societies (ACSESS). Web. 4 May 2016. <http://dx.doi.org//1995.7382483x>.
Other links from above:
SparkFun Methane CNG Gas Sensor, $4.95
Robert D. Bullard Lecture Announcement
Tow Center Sensor Journalism Report (Required Reading)
Assignment 2: My Microsoft HoloLens Experience
April 26, 2016
Last Tuesday I got the chance to try on the Microsoft HoloLens.
It’s a bit bulky looking and feeling, but I can only imagine that in time it will become just as outdated as the boxy television set. Augmented reality has so much potential for changing and altering the way we interact as human beings. In this “Information Age” that we live in, staying connected is important for the types of jobs we’re going to hold and the types of things we’re going to do with our interests as communications students.

I used to constantly feel overwhelmed by all this new emerging technology because I felt like I didn’t know what I could do with it. I’ve always felt incomplete as just a consumer, and yet I didn’t feel like I had the technical knowledge to understand the scope of technology like 360 video or augmented reality or virtual reality. I like to know how and why I’m going to use things before using them. However, I’m realizing through this class that no one really has it figured out yet anyway. The fact that I want to be a visual storyteller gives me purpose and reason for tinkering with this technology. Prof. Ken Harper talked briefly about Chris Milk and his work with virtual reality immersion; I was inspired by his TEDx Talk, where Chris explains how interactive technology has the power to unite people. I want to discover this for myself.
Wearing the HoloLens was very mind-bending: I pinched and picked up puppies and elephants and rainbows and placed them around the room. Prof. Pacheco explained that the HoloLens scans the room, so you can actually turn it off, leave the room, come back the next day, put it back on, and the same virtual “items” you placed around the room will still be in the same places. Fascinating!
I wasn’t a fan of the pinching gesture to click on things. As a digital native, I’m used to technology where pointing makes the most sense when trying to select things. Pinching makes me think of zooming in or out. However, I understand that it makes sense as it allows the HoloLens to capture the specific motion of two fingers touching together.
The headset was a little bit bulky, but the lens experience itself wasn’t dizzying or distracting at all. I still felt very present in the room with my classmates and could pay attention to what Prof. Pacheco was saying. The one thing that really blew me away was opening a web browser anywhere in the room and leaving it there. You can watch YouTube videos, check your email, or chat on Facebook within the HoloLens browser. The first thing that came to mind is the capability to edit a video literally anywhere. All you’d need is a HoloLens and a keyboard, and you could sit on top of a mountain (with decent WiFi, of course) and edit a video. No longer do you have to be tied down to a desk and a monitor.
I’m fascinated by where we are with technology and am glad I got a chance to be one of the first to try out the HoloLens. I wish I had taken a picture of myself using it, but alas, maybe in my next post.
When Journalism Students get access to the Microsoft HoloLens… (Assignment 2)
April 25, 2016
During the Week 2 lab, I was able to use the Microsoft HoloLens headset. We were some of the first students on campus to use the product because the headset was only ordered and received in early April. This was one of the first times I had experienced this kind of immersive technology, other than virtual reality with the Oculus Rift headset from Week 1’s lab, and it was amazing. In our first class we watched an advertisement video about the HoloLens, and after viewing it, I was sold. Unlike many other “futuristic” technologies being created today, the HoloLens can do pretty much everything it’s advertised to do.
While using the HoloLens, I was able to virtually place 3-D objects in the innovation lab. This was an interesting feature, and when using it I immediately thought of how useful it would be for designers, who could design their products digitally with the HoloLens. I think the best part about the HoloLens is that, since it’s a Microsoft product, the interface is very easy to use, especially for a PC user like me. I was able to watch YouTube videos and go to different websites, just as I would on my PC laptop.
Using the gestures made me flash back to scenes in the Iron Man movies where Tony Stark gestures at his home computer system. Using the HoloLens is actually very similar to how the Jarvis technology was portrayed, especially in that you can view and change your designs virtually. Of course, it’s not as advanced as in the movies, but it seems we’ll get close to that level of technology within the next few years. I can’t wait to see the new advancements as they develop.
The only negative to this technology is the bulkiness of the headset, but I’m sure that with time and work it will be scaled down to a sleeker design.
My career goals aren’t extremely clear yet, but as a sophomore dual Geography and Television major, I’m interested in creating content that will inform the world about its problems, specifically in urban development. With this in mind, I feel the options for my career with the HoloLens are endless. In my geography courses we discuss different environmental problems and development issues every day, and with the HoloLens we open up a new way of viewing a problem visually, which promotes a more creative outlook on problem solving.
Wearing the HoloLens is like having access to your own personal projector, but your projection screen is anywhere! This provides a different mode of viewing television and video content. I can imagine a new industry of interactive media created for the HoloLens that will create a new kind of experience for audiences.
Assignment 2: Post a blog entry about what you experienced with HoloLens. How do you think augmented and “mixed” reality might affect your career in the future?
New Technology Science Fiction Predictions in the Real World (Assignment 1)
April 12, 2016
Her is a 2013 science-fiction romantic comedy about a man’s relationship with an artificial-intelligence companion. The movie includes multiple examples of new technology predictions.
In relation to our class, there was one specific scene in which the main character plays a game projected into his living room for a fully immersive 3D simulated experience. In our second class, when we met in the innovation lab and played games on the 3D gaming software there, I immediately thought of that scene. This is a clear example of a technology prediction from a movie coming into real life.
Here’s the scene!
After looking into the contributors to the scene a little deeper, it turns out that David O’Reilly, who programmed the version of the game shown in the movie, decided to create a game for the real gaming market.
O’Reilly announced his plans for the game, Mountain, at the Horizon Indie Game Conference at E3, and ultimately released it in the indie game market in June 2014.
Here’s the game trailer:
Here’s the Mashable article where I found the information about O’Reilly’s game.
As far as my field test goes, I am still unsure where I want my focus to be. 3D gaming is very interesting to me, but I don’t know how I can apply it to media. Hopefully after more research I will come up with my official field-test idea.
Assignment 1 – Veronica Ortiz
April 12, 2016
Ever since playing the Mass Effect video game trilogy, I’ve been fascinated with the possibilities of artificial intelligence. In the game, the spaceship’s AI, EDI, eventually imports its consciousness into a humanoid vessel and learns to manifest human-like qualities such as emotion, virtue, and humor.
Fortunately for me, the prospect of artificial intelligence becoming as adaptive and relatable as EDI is a very tangible one. Artificial intelligence is an exponential trend that will keep evolving and growing at a very fast pace. Some experts even estimate that computers will be as intelligent as humans by 2035.
In the near future, AI will be adaptive, self-learning, and intuitive; it will be able to change its own rules and programming through its learned experiences.
The future of AI is approaching rapidly, but we don’t have to wait until 2035 to see examples of it.
In class, I am most drawn to learning how to use the 360 video rigs. For the field test, I would be interested in seeing how 360 video can be used to complement narrative storytelling.