Minkewicz – Field Test
May 10, 2017
For my field test I decided to use the Facebook bot technology we learned about. I'm going to attempt to create a bot that will give you up-to-date information and news on what past alums who graduated from Newhouse are up to, where they worked before and how you can get in contact with them. For the reporters I also want to include links to some of their work and maybe their social media accounts, and whether the station they are working at has any job openings. I'm not sure this is even going to work and I have my doubts, but I really want to give this a shot.
The main purpose of this bot is to help connect people looking to network for a possible job.
I haven't tried embedding links in the bot, so I think I'm going to start there. I think the easiest thing would be embedding links from news websites and the bios of any alums who work there. That way the bot would be able to send the person talking to it up-to-date information even if it can't explain exactly what's going on itself. From there the user can navigate through those links and find what they want in cases where the bot becomes unresponsive.
I know there are Newhouse people all over the place, so I think I'm going to start by looking up people who are working in Syracuse. For time purposes, and because there are a lot of bios to look into, I'm keeping it to Channel 9, CNY Central and Spectrum News. I might throw in the radio stations but I'm not sure yet.
After about an hour of going through people's bios I finally got through the whole list! I knew there were Newhouse people working in Syracuse, but I found a lot more than I thought I would. I probably pulled about 20 names from those three stations.
Just finished adding in everyone's bios that I found. I also threw in some normal conversation cues to make the bot a little more interactive. Right now I think it's a good time to test out the bot, see how others respond to it, and see if I need to make some more adjustments.
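For anyone curious how a bot like this can match a question to the right bios, the core lookup logic only takes a few lines. This is just an illustrative sketch; the alum names and URLs below are made-up placeholders, not the real data from my bot:

```python
# Hypothetical alum directory keyed by station name (placeholder data).
ALUMS = {
    "channel 9": [
        {"name": "Jane Doe", "role": "Reporter",
         "bio_url": "https://example.com/jane-doe"},
    ],
    "cny central": [
        {"name": "John Smith", "role": "Anchor",
         "bio_url": "https://example.com/john-smith"},
    ],
}

def bot_reply(message: str) -> str:
    """Reply with the Newhouse alums at whichever station the message mentions."""
    text = message.lower()
    for station, people in ALUMS.items():
        if station in text:
            lines = [f"{p['name']} ({p['role']}): {p['bio_url']}" for p in people]
            return f"Newhouse alums at {station.title()}:\n" + "\n".join(lines)
    # Fallback conversation cue when no station is recognized.
    return "Try asking about Channel 9, CNY Central, or Spectrum News."

print(bot_reply("Who works at Channel 9?"))
```

A real Messenger bot would wrap this lookup in the platform's webhook plumbing, but the matching idea is the same.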
One downside to this bot that I didn't consider until now is that it's going to have to be updated every few months, because people might not work at those stations anymore. Another downside I found while creating this bot is that although I list which reporters work where, I didn't have info for the people who work behind the scenes, such as producers.
Overall I think the bot serves its function on a simple level. If anything, it saves you the time of having to look at all the bios of the reporters in the area to see which ones graduated from Newhouse. Adding their individual contact information was a lot more difficult than I thought it would be. For the people who tested out the bot, it just didn't seem to work out too well. Perhaps it was information overload. I might consider working on this some more, because I think it has great potential to be a resource for people looking to reach out to Newhouse people and make that networking connection.
Here’s the link to the Bot for you to check out!
Minkewicz Vision Paper
May 10, 2017
New Tech for New Media final paper
The Rise of Alexa 3000
It's the year 2050, and if you think you can pick up a newspaper and coffee and be on your way, you're wrong. Not that you would mind much anyway, because people haven't been reading on paper for quite some time now. Besides, was it ever really efficient to carry both your coffee and a paper while walking and reading? Probably not.
Everyone and everything uses technology, and with Alexa 3000 that's all you need.
The first model came out a few years ago, and at first I was unsure if I wanted to make the investment in one. Amazon was still working out some technical problems with the model, and to be honest it was a little out of my price range: a total cost of $1,000, plus an annual fee of $150 for software updates. It all seemed excessive to me. However, more and more people were purchasing them. So I made the investment.
Morning routines are typically planned out on a specific schedule. Watching or reading news stories in the morning can help people catch up on the latest stories and add talking points throughout the day, but the opportunity to sit down and absorb the news is something that busy Americans cannot afford. This is where technology comes in.
Contrary to the iconic, traditional image of the average American family sitting down to eat breakfast together, mothers and fathers are busier than ever, and in this day and age that image is unrealistic. Morning is not the time of day when people browse through the newspaper or have a living room discussion about current events; morning is an overload.
Responsive AI devices, such as Amazon's Alexa 3000, bring the news back into one's morning with no real delays or changes to the typical morning routine, and I'll talk about the consequences of this a little later.
These devices can load and display top stories and current events with a simple, “Show me today’s news” command. Amazon’s Alexa 3000 can even play audio recordings of news stories or live streams of some news services.
Convenience is key with AI, and these devices are certainly convenient for the hectic schedules of most Americans.
So you're probably wondering how any of this is even possible. Well, Alexa 3000 uses sensors to track EVERYTHING: what people are saying and when they're saying it, and what news is happening the second it happens, eliminating any news job that ever was. We no longer turn on the TV; if a natural disaster occurs, you can turn to Alexa, ask her what's going on, and she will give you a full report right then and there. Most importantly, it's accurate. With all this debate about fake news, Amazon thought it was important to invest in technology that would eliminate the stigma that comes with sharing information. They nailed it. Unfortunately, they took my job out in the process. Did I forget to mention that's why I didn't want to invest in this technology?
I never thought a piece of technology would be able to effectively tell stories the way newsmakers can. After all, a TV reporter has to interview, shoot and write most news stories. But this technology allows Alexa 3000 to see everything that's happening and capture it. The technology is even able to get interviews with those involved in the events and stories, most of whom are willing to talk anyway. Many of them expressed relief in talking to a piece of technology rather than a person, because like almost everyone, they also owned an Alexa 3000.
As I mentioned before, this technology provides a service that people can rely on for information. However, relying on that convenience impacts journalism in traditional media sources. If someone chooses to consume news stories through an AI service in the morning, that individual may decide not to turn on the local news station after returning from work. Moreover, continued reliance on an AI device's service may breed familiarity and fondness for that specific style. Similar to how people may grow to love their station's news anchors and reporters, Alexa 3000 is now that fond, familiar voice, ready to report on the latest news at any moment the user requests it. If you didn't grow up with this type of technology, you might find yourself like me and think this whole thing is weird. However, it's now the norm and something even I have to get used to.
Let's switch gears and talk a little bit about how this technology functions. When you want to know something, you basically ask Alexa 3000 and she projects all the information for you onto a virtual screen that you can interact with. Here you're able to read articles, watch news stories, and see what others on social media are saying. We no longer need the services of a reporter or journalist, and because of that, those jobs, including mine, are no more.
I've considered working for the giant corporation Amazon, but I'm so bitter about how everything turned out that I'm not sure I can. Plus I would have to go back to school and learn a whole new skill set. I'm torn.
So here I am, jobless and clueless. Others have found the transition to be a smooth one, mainly those working in government and technology; those two go hand in hand nowadays. There are even laws being passed that force you to own an Alexa 3000; otherwise you can face multiple fines and jail time until you do. One per household, that's the rule. Crazy to think that something that started as a feature to help make life simpler would turn into something that's more involved in your life than anything else. The argument is that it makes life easier, and yes it does, but when is it too much? To have technology listening to your every word and following your every move, and not having control of that, is too much in my opinion. There's even an Alexa 3000 in public bathrooms!! I mean, come on, that's overkill. Then again, it knows when you're in the bathroom at home, so I suppose it's no different. What information can you possibly gain and use from that? To each their own, I guess.
The other day I was having a conversation with my mother about why she favors Alexa 3000, and she made some valid points. She needs help getting ready in the morning, and because Alexa 3000 is pretty much a full-on person, it's able to help my mom get dressed and ready for the day. It also helps my mom with grocery shopping and balancing her checkbook. Weird concept, right? These small tasks take a lot of pressure off of my mom, who sometimes can't remember where she left her car keys. Oh, did I mention Alexa 3000 drives my mom to the store as well? I guess without this technology my mom would have to rely on her children and family members to help her with these basic needs. That's something her children can't do for her, because every one of her five children lives in a different country and isn't able to move because of their jobs. That I can get, and I agreed with her. And just for the record, I was in town visiting, so don't think of me as someone who can't help my own mother… Even though now that I'm unemployed I can… Alright, maybe I should rethink some life choices. All in all, though, she's pretty comfortable with her setup, and I think at this point she's very used to it. I could always ask her if she'd want my help, but I think she'd turn my offer down, giving me some reason that the technology is more efficient and can do anything I can do better and faster. She's most likely right about that. Even my own Alexa 3000 runs the errands that I either don't want to do or don't have enough time to do myself. Who am I to argue with my mom about something that even I do myself? It seems kind of pointless, because with all the points she made, she's very much right. We are all reliant on this technology to get us through the day, and without it we would feel completely lost.
I can see this technology continuing to take over more and more careers, as well as completing everyday tasks effortlessly. While I see the benefits of investing in this technology, I can't help but miss those days when we as people were more independent and worked for many of the things we wanted; when we took the time to research what was going on in the world and come up with our own opinions and conclusions, instead of having a piece of machinery constantly feed us information that we have to take at face value because we don't have any other choice. We were once thinkers, innovators and doers, and that's been watered down quite a bit. Is efficiency worth it if it means sacrificing those basic human instincts to be curious and learn more? I don't think so.
This concludes my rant. I’m going to go job hunt now.
Vision Paper – Elena DeLuccia
May 10, 2017
Sleeping used to be so easy before the project. It was supposed to change everything. I didn't know what to expect when I signed on; most of us just needed the money – I know I did. It sounded like a luxury, too: all this power at the blink of an eye, literally. It seems, however, that we might have taken on a bit more than we were capable of.
It's been about a month since I became a part of the project. They offered a bi-monthly pay of $11,000 to a dozen individuals who had the proper amounts of time, intelligence and stability. We will get our pay once we've spent two months following the instructions to a tee, and reporting all of our experiences. Other than that, we simply live our lives. I haven't seen the others since we got implanted. We met, talked about how intrigued we were, danced around our hesitance, and exchanged contact information, in case the manual confused us. They told us that the manual will get smaller the more reports the test subjects give. We'll be the ones to figure out the tricks to making it work. The manual now consists of pages upon pages of the neurological make-up of each section of the brain. Which neurons are supposed to fire where, what they mean, why they might do the things they do, and how we know it all. After this month, I'll have only scanned a few pages at random. There are no guides in the manual about what to do when your intrusive thoughts kick in and lead you to videos or articles of someone driving off a bridge, or punching someone in an office.
Every thought needs to be controlled lest the computer catch on to that specific thought and bring you to some obscure part of the internet. I’ve unintentionally watched countless videos of people on bicycles crashing into things because when I see someone riding a bike I think: Hey, what if they just hit a patch of sand and ate shit. Lo-and-behold, hundreds of videos, articles, pictures, of hysterical and/or horrifying bicycle accidents. There are also, of course, the contacts that pop up. Aunt Sandy, Shithead, Patches Barber Shop. Aunt Sandy is the hardest to get off the phone. Shithead is an old boyfriend of mine whose number I was never ready to get rid of. He doesn’t ask too many questions when I accidentally call him since I’ve done it a fair few times. The first time I ever called him was the day I got home from the implantation. I explained to him the project so he wouldn’t think I was just the creep who calls their ex-boyfriend two years after last speaking. I had been spreading mustard on a sandwich wondering how much mustard a person would have to eat before their shit turned yellow. Then I thought about how disgusting it was that I always have involuntary thoughts about shit running around my head, following which, the ringing began. I reached to my back pocket for my phone which wasn’t there anymore. They took everything I had on my phone and transferred it to the chip, so it would feel the same: photos, videos, saved web pages, games, contacts, apps. I still technically owned a phone, it was just in my head. The ringing continued and I realized I had made a phone call, not knowing to whom.
“Hello? Hello? Devin? Why did you call me? Hello?” He took no time between each question, breathlessly moving from one to the next. I rolled my eyes and hoped that this wouldn't trigger some Google search for ‘irritated looks’ or ‘sarcastic responses’. His voice sounded from the inside of my ears. It felt as though someone was whispering into my ear from behind, the sound circulating like wind through a tunnel deep within my ear canal; I almost felt the sensation of breath down my neck, but there was no one around. What was most unsettling was that I could have just as easily been having a conversation with myself. I had to constantly remind myself that he was real, and hearing me too.
“Yeah, um, hi. Sorry I didn’t mean to dial you.” No one said anything for a moment.
“The last time I checked you had me in your phone as ‘shithead’. How do you ‘accidentally’ dial shithead?”
“It’s just this whole thing. I’m sorry. I’ll go.”
Hang up, I thought, but I still heard the buzz of sound from the other line. Hang up. I thought louder, but still the buzz, and a few quiet breaths. I wondered why he hadn’t hung up yet. I began to blink ferociously and shake my head back and forth to get the phone call to end.
“Dev? Are you still there?”
“Sorry yeah. I don’t know how to hang up so can you please?”
And so began the conversation where I explained to him everything. How I was selected out of hundreds of applicants to have a chip inserted into my brain that would pick up the electrical impulses from my neurons as they fired. The chip has all the capabilities of a computer; I, and the 11 other people selected, simply had to learn how to control it. The research goal is to get our thoughts organized enough to control an entire computer with our brains, thus giving us limitless knowledge and constant access to the rest of the world. Someday, the chip will hopefully sell with the ferocity that the iPhone does.
We talked about how he was my first phone call. How it was triggered by an uncontrolled thought about shit. He asked me to tell him what the 62nd digit of pi was and what the capital of Uzbekistan is, two things he was positive I didn't have in my repertoire of knowledge. It didn't take long for me to figure out how to conduct a proper Google search in my head. Some thoughts are louder than others. I can close my eyes and see that the capital of Uzbekistan is Tashkent, and its neighboring cities are Urtaaul to the west, and Yalangach to the northeast. I've even mastered conducting these searches with my eyes open, although I haven't yet learned how to focus on the person I'm talking to and conduct a search at the same time. Everyone now knows that I'm scrolling through Facebook or Twitter when I check out of a conversation. Last week I visited my parents to update them on the progress I was making with the chip. My dad ranted about the government's involvement in my thoughts and how this was the stupidest decision I had ever made. My mom still wants to talk to me about work and how my life is going. While my mother tries to explain to me the plot of the most recent book she's read and my father tries to talk over her about the government knowing when I'm watching porn, I find something else to do.
“… but only in the third act of the book does she realize that he was poisoning their daughter all along and that’s why he stayed…”
“… every time you’re doing nasty things to yourself the government knows it because they’re probably off somewhere checking your temperature and sperm count…”
I'm already logged in. I can still hear my parents talking, but it's just like looking down at your phone and honing in on whatever you have pulled up on your screen. It's easier to pay attention to that.
“Nadine just give me a second to talk to the boy.”
“He’s heard it all before Alan just let me tell him about my damn book.”
I'm staring intently into my mother's eyes as she talks, giving her the idea that I'm engaged in her words, but I'm looking at the most recent engagement picture on my news feed. Samantha Litto got engaged to this huge man with a perfect beer belly despite his young age of 27. The picture is of them on a boat standing next to each other, him pointing to her hand and her holding out her ring finger prominently. The bottom of her fiancé's hairy belly is poking out of his t-shirt and I'm disgusted by it.
“I know, terrible, right? Guts everywhere and not a clue who did it, but I know it's going to be the stepfather, obviously.” My mom continues, reading the disgusted look on my face as a reaction to her story. I tune back into her, but the photo is still in the forefront of my mind and I have trouble getting it to close. My mother keeps talking as I mentally track the mouse to the ‘x’ at the top of the screen. Unfortunately for me, my mother gasps, remembering a new bit of information from the book, causing me to click on a video. The volume is already up way too loud, as I last had audio playing during the shower, where I need it turned up.
“AGH” I cringe and clutch my ears; my parents start to talk to me in a panic and I put my head between my legs with my hands still up at my ears, which doesn’t help at all, by the way. With my eyes closed though, I can concentrate. I can focus not on my environment but on getting the mouse to click ‘pause’, and then to move it to reduce the screen. The song stops and my surroundings become clear instantly. I slowly take my hands off my ears and my parents are dead silent. All I can hear now is the sound of the dishwasher running and leaves rustling outside of the glass door at the back of the dining room.
Slowly looking up, I see my father standing there, arms crossed and smug. My mother’s eyes are wide and she is clearly coming down from a moment of sheer panic watching me settle myself. I take a breath and look at them. My mom speaks up:
“Was that the government?”
Field test Tony Yao
May 10, 2017
For my field test, I wanted to know whether Amazon's voice recognition system can actually tell real human voices from computer-generated ones. Let's see what happens when it hears computer-generated voices.
What I wanted to know is whether machine-generated voices can be recognized and how accurate the recognition is. For comparison, I chose XXX types of computer-generated voice systems, which include:
Also, recognizing “Alexa” alone is not enough for this test. I will also try to make it understand longer sentences like “What's the weather today?” or “Do you know how to make a chocolate brownie?” to see if machine-generated voices (which are obviously fake to us) can be detected by the Amazon Echo.
During the test, I realized that if the sentence is too short, Alexa may not be able to recognize it. That's why I added “Errr” between “Alexa” (the trigger word) and the sentence itself.
I will first use my own voice to get a baseline answer, so that if Alexa recognizes a generated voice, it should reply with the same answer.
Human voice saying “Alexa, what's the weather today?” and “Alexa, do you know how to make a chocolate brownie?”
The Amazon Echo is able to recognize Google Translate voices every time.
The Amazon Echo can recognize “What's the weather today” but is not able to recognize the longer sentence.
The Amazon Echo is able to recognize Oddcast.com voices every time.
The Amazon Echo is able to recognize Acapela Group voices every time.
A more human-like voice has a better chance of fooling the Amazon Echo into thinking an actual person is talking to it. A more machine-like voice can trigger the Echo, but the Echo is not able to recognize the whole sentence.
I did this field test partly just for fun. The Amazon Echo is a good helper in our everyday lives. However, I also recognized some of the safety issues this might create. If a human can fake a voice and get recognized by the Amazon Echo, then he or she may be able to fake someone's presence at home even when that person is not actually there. This kind of security issue can be serious if the Amazon Echo can't tell who is actually talking to it.
Voice recognition systems still have a long way to go, but I believe that human intelligence will solve those problems and bring true convenience to our lives.
Edwards- Assignment 5
May 9, 2017
One of the sensors on SparkFun is the alcohol gas sensor: https://www.sparkfun.com/products/8880. It operates like a breathalyzer in that it can estimate someone's BAC from their breath. This particular sensor has high sensitivity and fast response time. For a journalism story, you could get permission from a police department to put these in the back of police cars. Since they are tiny, they would be easy to conceal in the back of the car. Then you could measure the correlation between arrests and alcohol. How many of those arrested had been drinking prior to their arrest? Does the severity of the alleged crime have anything to do with the amount of alcohol consumed? I think these results could be interesting.
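To give a feel for the data side of such a story, here is a rough sketch of turning the sensor's raw analog output into a usable number. The baseline and scale constants are made-up placeholders, not real MQ-3 calibration values; a real deployment would need per-sensor calibration against known samples:

```python
def adc_to_voltage(raw: int, vref: float = 5.0, bits: int = 10) -> float:
    """Convert a raw ADC reading (e.g. from an Arduino analog pin) to volts."""
    return raw * vref / (2 ** bits - 1)

def estimate_bac(voltage: float, baseline: float = 0.4, scale: float = 0.05) -> float:
    """Very rough linear BAC estimate from sensor voltage.
    baseline (clean-air voltage) and scale are illustrative placeholders."""
    return max(0.0, (voltage - baseline) * scale)

raw = 512                    # hypothetical mid-scale reading from the sensor
volts = adc_to_voltage(raw)  # roughly 2.5 V on a 5 V, 10-bit ADC
print(round(estimate_bac(volts), 3))
```

For the story itself, you would log readings with timestamps and join them against arrest records afterward.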
Field Test – Elena DeLuccia
May 8, 2017
I had an interesting time with my field test! I originally planned to go to Ithaca’s gorges and get a one-minute 360-degree video at each, but with the very, very limited time I’ve had with finals, projects, my job, etc. it just didn’t work out. So, instead I decided to take the Nikon KeyMission 360 to New York City with me on my apartment search and make a quick stop at my favorite place: Lincoln Center. I wanted to go on top of the awesome grass field that rests above one of the buildings because that’s absolutely the best place, but it wasn’t open for the season just yet so I stayed in the main plaza.
In New York City, there are no tripods allowed without a permit. I tried to swing it but got caught by the cop that roams the premises – and I have the 360 video to prove it. I quickly decided to leave the camera resting on the fountain (under careful watch) in order to get a shot of the entire plaza. I know that height is a huge consideration when doing 360 video, so it’s unfortunate that I had to leave the camera at a child’s height, although I still think it gives you a cool look at the plaza.
I had a little trouble getting the camera to agree with me and record, but after a few tries I finally got it to work. I have two full takes that worked really well, but I just uploaded the first one since the plaza looks a little better.
When I got back home, I uploaded the video into Nikon's KeyMission Edit application, which I downloaded for free on my MacBook. I actually edited the sequence down on there, and since it had a function for saving the video in the correct format for YouTube, I uploaded it right after. I decided not to add any background music because I wanted the audience to hear the natural sounds of the city and feel like they're there, on 65th, sitting on the fountain with people surrounding them.
Vision paper Tony Yao
May 7, 2017
A Journalist's Diary
Another ‘new’ day of my life.
My supervisor Johnny told me to interview some of the handmade clothing companies in China. Come on, who cares about handmade clothes anymore? Since the launch of Amazon Look around 5 years ago, people have long been controlled by this ‘fancy’ clothing guide machine.
Humankind is becoming lazier and lazier. All this convenient AR stuff has cut humans' ability to use their minds to the lowest level ever! Nobody tries to find clothes at a store; they all order online and try them out with a fancy AR mirror. No more searching: all you have to do is ask Alexa and everything is done. What a lazy world it is!
Luckily, there are still some people working on traditional clothing art. The couple I interviewed today never sell their clothes on a fancy clothing website. Their opinion is that customers should try clothes on before buying, because they can't feel the fabric when they buy online.
I am so worried that one day humans may lose their ability to decide and to feel the world because of these fancy technologies, but they are convenient for sure. Oh, shit, Alexa is warning me to sleep right now. For a healthy body, these voice recognition systems do some good.
Today, our last cameraman quit his job and our company became one of the “auto” TV channels in this country. This guy, Egzon, has been our only cameraman for the past year, and we got some great footage. The problem is, we don't really need a cameraman anymore. Egzon got paid so little that even a single guy without an extravagant lifestyle can't live on that salary anymore.
The problem started four or five years ago, when wearable and portable cameras went viral and even a 5-year-old kid could take a good picture and shoot great video without even learning. All those drones and automatic gadgets fulfill people's need for a memory-saving machine, but they also cut out people's passion for actually getting complex picture-taking done.
This got worse when Amazon put its first-ever auto wearable 360 camera online. With the capability of shooting everything around you in 16K resolution, there is no need to bring a camera with you anymore. All pictures and videos can be shot automatically, and it is impossible to make any human mistakes such as missing the scene or losing focus. For a TV company like ours, a single investment in a machine is much better than keeping all those camera staff working.
Egzon is a good guy. He just wasn't born at a good time, when you could still make money with your excellent filming skills. Maybe he can still make this one of his hobbies, but there is no space for him in the company.
What a shame.
P.S. Maybe one day I will also be cut out, because they might no longer need a journalist anymore. Who knows.
I can't remember the last time I had to drive a car myself.
Self-driving cars are easy to use, much safer than human drivers, and help to prevent traffic jams. With the technology boost two years ago, I no longer drive at all. Am I kind of missing the passion and joy of driving myself?
I interviewed a car manufacturer today, and they said that next year they might cut out the human emergency driving function. Jimmy, the CEO of VW, announced that with great excitement.
“We are safer than ever!”
What about the joy we once had?
Driving a car was so much fun, but it is illegal right now, to prevent car crashes and traffic jams. With all those sensor inputs connected to the cloud, traffic certainly flows more smoothly, but getting from one place to another is lacking in fun.
I can still remember the days when I could feel the vibration of my steering wheel. I can still remember the passion when I turned into corners. I can still remember the joy when I first changed my brakes.
Now it’s all gone.
No more human drivers- as the law says.
The only “human” drivers you can see are in racing competitions, and even they are not actually driving anymore; they sit in a small cockpit and remotely control their cars. What's the point!
The technology is developing at the cost of our joyfulness, so what's the point of that?
Worst birthday ever.
I lost my job because the company decided to use drones to do our jobs. All they need is 5 people sitting in a room all day to cover all the news online and interview people using drones.
It is not unpredictable.
The last time I attended a government conference, they all used drones. In that big newsroom, I was the only human. The Secretary of Homeland Security didn't show up; she sent her drones. None of the major TV companies showed up; they sent their drones.
That was a terrible experience. All you could hear was the spin of fans, and I was the only one who had to get paper-based materials.
I love my job, and I think there is a purpose in getting face to face with other people and hearing what they truly think; their facial expressions are the most satisfying moment of each conversation.
I feel alive.
But humanity can't beat money, and in every way, AI is taking control. It is much cheaper and it doesn't make mistakes.
Guess it is time for us to leave the stage.
Or for humankind as well?
Seven days after this diary entry:
AI took over and erased humankind.
NMNT Field Test Yi Zhang
May 6, 2017
360 Video Tour for Prospective Photography Student
I wanted to get experience shooting with a 360 camera, so I chose to create a 360 video tour for prospective photography students. 360 video is very good at demonstrating a space. Prospective students must be very eager to know the environment they will be in when they come here. I used 360 video to demonstrate two places that photography students come to very often, and to show the advantages of the photography program here.
I used the Samsung 360 camera and a stand to shoot. I shot each clip for about 1 minute. After stitching the video in Gear 360, I edited it in Premiere and added background music and a voiceover. I also used the offset effect to adjust the start frame of each clip to what I want viewers to see first. It turned out that my last clip, of the studio, was skewed; I might not have set the camera straight while shooting. Except for that, I think everything else in this video works well.
Through this field test, I got hands-on experience producing 360 video. Most of the things to pay attention to when producing 360 video are the same as for regular videos. But we need to consider some new things too, such as whether or not to include yourself in the 360-degree space, which angle you want viewers to see first, what scenes are suitable for showing in 360, special effects for 360 video in Premiere, etc.
I'm also taking a course named Sound for Picture this semester. It makes me think that if people use 360 video to produce movies, sound design will be a challenge. Sound designers will need to create a surround effect that matches the different perspectives of the 360 video. It will be a great chance to get people deeply involved in the movie. I'm really looking forward to watching that kind of movie one day in the future.
NTNM Assignment 5 Sensor: Yi Zhang
May 6, 2017
The SparkFun Soil Moisture Sensor is a simple breakout for measuring the moisture in soil and similar materials. The soil moisture sensor is pretty straightforward to use. The two large exposed pads function as probes for the sensor, together acting as a variable resistor.
It can be used by journalists to collect data for stories about agriculture or weather. It can also be connected to an automatic watering system, where it helps determine when and how much to water, which may save water and energy. It can be used at home or for plants in public areas. In addition, I think it may be able to be used in food processing as well: some foods need a certain level of moisture to be right, and the moisture sensor can monitor that.
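As a minimal sketch of the watering logic, the raw sensor reading can be mapped onto a moisture percentage and compared against a threshold. The dry/wet endpoints and threshold below are placeholder values; a real setup would calibrate them by sampling the sensor in dry air and in water:

```python
def moisture_percent(raw: int, dry: int = 0, wet: int = 880) -> float:
    """Map a raw analog reading onto 0-100% moisture.
    dry/wet are illustrative calibration endpoints, not datasheet values."""
    raw = min(max(raw, dry), wet)          # clamp to the calibrated range
    return 100.0 * (raw - dry) / (wet - dry)

def should_water(raw: int, threshold: float = 30.0) -> bool:
    """True when the soil reads drier than the chosen threshold."""
    return moisture_percent(raw) < threshold

print(moisture_percent(440), should_water(100))
```

An auto watering system would poll this reading on a timer and open a valve only while should_water stays true.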
NTNM Assignment 5 Tony Yao
May 6, 2017
In my opinion, a lot of sensors can be used as tools for outdoor survival. Temperature sensors can tell you whether you should light a fire or take off extra clothes. The weather meters can be used to predict extreme weather in order to keep you safe. Flex sensors can be placed in your tent so that when their flex reaches a certain extent, they can warn you about possible danger. Heart rate sensors can be used for health checks and can send signals when necessary. Force sensors can be used as alarm systems to ward off deadly wild animals. A pH sensor kit can be used to test your water supply.
These sensors on SparkFun are extremely small, so it is possible to combine them into an ultimate survival tool that is reliable and life-saving.