Drones have received a lot of attention in the American media recently for a variety of reasons. For avid Instagram users, they have become an exotic addition to the picture-taking arsenal. For many others, they are far more significant as a new dimension of modern warfare. In warfare, two types of drones are typical: one is simpler, while the other is more complex and, from a computer science standpoint, more intriguing.
A diagram depicting the relationship between the operator and the drone.
At their core, most UAVs (unmanned aerial vehicles) are quite similar to a remote control car with which kids might play. The vehicle's engine is controlled entirely by inputs from a separate remote control. From this remote control, a user can have the UAV change its speed or direction, use it for reconnaissance purposes such as photography, or engage the drone in physical warfare. UAVs were first deployed in practice by the Israeli military in the 1970s, then by the Iranian military in the 1980s, and subsequently by the U.S. military in the Gulf War in the 1990s. At that point, UAVs were used for reconnaissance purposes or as decoys. The first kill by a drone occurred in October of 2001, and drone strikes have occurred with increasing frequency since, a controversial tactic employed by the American armed forces.
An unmanned aerial vehicle firing a rocket.
Drones can have varying levels of autonomy, however. Many perform almost all functions under the guidance of an operator, but can perform a function such as "return to base" by themselves. Others have greater capabilities, relying on sensors that report on the world around them to inform how they act. These drones operate by running a series of control loops, from algorithms calculating the most efficient route with regard to fuel and time to constructing an actual trajectory from one location to another. Fully autonomous UAVs are said to be entirely cognizant of their surroundings and capable of total independence in decision making. While fascinating from a programming standpoint, the political and ethical side of autonomous drone warfare has limited its implementation. Fear of malfunction or hacking has also left decision makers wary of releasing autonomous UAVs in full force. The capabilities of these machines are nonetheless astounding and will certainly factor into the engineering landscape of the future.
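The control loops described above can be sketched in miniature. The following Python toy is entirely illustrative rather than any real autopilot: it steps a drone toward a list of waypoints and triggers a "return to base" override when the remaining battery would no longer cover the trip home. The function names and the one-unit-per-step battery model are my own invention.

```python
import math

def fly_to(x, y, target, step=1.0):
    """Advance one fixed-length step toward target; return the new position."""
    tx, ty = target
    dx, dy = tx - x, ty - y
    dist = math.hypot(dx, dy)
    if dist <= step:
        return tx, ty
    return x + step * dx / dist, y + step * dy / dist

def mission(start, waypoints, battery, cost_per_step=1.0, step=1.0):
    """Visit waypoints in order; trigger 'return to base' when battery
    runs low. Returns the flight log of positions visited."""
    base = start
    x, y = start
    log = [(x, y)]
    goals = list(waypoints)
    while goals:
        # Autonomy check: could we still make it home after one more step?
        home_steps = math.hypot(base[0] - x, base[1] - y) / step
        if battery - cost_per_step < home_steps * cost_per_step:
            goals = [base]          # abandon mission, return to base
        x, y = fly_to(x, y, goals[0], step)
        battery -= cost_per_step
        if (x, y) == goals[0]:
            goals.pop(0)            # goal reached; move to the next one
        log.append((x, y))
    return log
```

With ample battery the drone reaches its waypoint; with too little, the override kicks in and the log ends back at base.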
https://en.wikipedia.org/wiki/Unmanned_aerial_vehicle#/media/File:Autonomous_control_basics.jpg
https://en.wikipedia.org/wiki/Unmanned_combat_aerial_vehicle
https://en.wikipedia.org/wiki/Unmanned_aerial_vehicle
http://www.globalexchange.org/blogs/peopletopeople/tag/drone-warfare/
Friday, September 30, 2016
Friday, September 23, 2016
Alan Turing
While I initially learned of Alan Turing through the movie "The Imitation Game," I came to appreciate his influence on computer science once I started taking this class. After learning just the basic constructs that go into programming, I came to understand how difficult it must be to create a programming language, let alone the very first one, with no precedent to build on. The codebreaking machines Turing helped develop during World War II, used to break ciphers produced by the Nazi military, ended up being the basis for far more developments in the world of computer science.
The cipher wheels of the German Lorenz SZ42 machine, whose settings Colossus was built to find.
Turing worked at Bletchley Park for the British during World War II, where he designed the Bombe machines that broke the German Enigma cipher. The Colossus, built by engineer Tommy Flowers, drew on statistical methods Turing had pioneered and is understood to be the first programmable electronic digital computer. By studying a lapse in German operating procedure, the British codebreakers realized they could compute ΔZ1 ⊕ ΔZ2 ⊕ Δχ1 ⊕ Δχ2 across a message and count how often the result was a "dot" (zero); an unusually high count revealed the cipher machine's wheel settings.
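The counting procedure can be illustrated with a short Python sketch. This is a simplified model of the double-delta test Colossus automated, not a faithful reconstruction: it XORs the "delta" (each bit combined with its successor) of four bit streams and counts the positions that come out to zero, a dot.

```python
def delta(bits):
    """Delta stream: XOR each bit with its successor."""
    return [a ^ b for a, b in zip(bits, bits[1:])]

def dot_count(z1, z2, chi1, chi2):
    """Count positions where dZ1 ^ dZ2 ^ dChi1 ^ dChi2 == 0 (a 'dot').
    A count well above half the length hints the chi wheels are aligned."""
    streams = [delta(s) for s in (z1, z2, chi1, chi2)]
    return sum(1 for a, b, c, d in zip(*streams) if a ^ b ^ c ^ d == 0)
```

When the trial chi streams exactly match the ciphertext streams, every position XORs to zero, so the dot count is maximal.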
Rather than being honored for his efforts, Turing was prosecuted in 1952 under Britain's laws criminalizing homosexuality. He has since been formally pardoned and acknowledged by the British government for his contributions not only to World War II, but to the field of computer science in general.
https://en.wikipedia.org/wiki/Alan_Turing#Early_computers_and_the_Turing_test
https://en.wikipedia.org/wiki/Colossus_computer
https://en.wikipedia.org/wiki/Colossus_computer#/media/File:SZ42-6-wheels-lightened.jpg
Friday, September 16, 2016
Magnetic Key Cards
Magnetic key cards are ubiquitous in the modern world, with their technology applied to everything from credit cards to hotel keys. Considering our University of Richmond IDs use this same technology for so many things we do on a daily basis, I decided to look into the technology behind key cards.
https://en.wikipedia.org/wiki/Magnetic_stripe_card
The black stripe visible on most magnetic key cards is the surface of a thin metallic band on which tiny iron-based magnetic particles are arranged so that their polarity pattern stores data. Because these arrangements are microscopic, billions of combinations are possible. In practice, this means that each time we swipe our UR IDs, credit cards, or anything else with a magnetic stripe, the reader analyzes the stripe's pattern and uses that information to determine what access should be granted or, in the case of credit cards, what account should be charged. It is impressive that the simple act of swiping a card tells a machine who you are and what actions should follow, but that is the power of this technology.
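As a concrete illustration of the data a reader extracts, here is a hedged Python sketch of parsing the Track 2 layout defined by ISO/IEC 7813 (start sentinel ";", field separator "=", end sentinel "?"). It is deliberately simplified, skipping the parity and LRC checks a real reader performs, and the card number in the example is synthetic.

```python
def parse_track2(raw):
    """Parse an ISO 7813 Track 2 string such as ';<PAN>=<YYMM><service>...?'
    into its primary account number, expiry, and service code.
    Simplified sketch: no parity or LRC verification."""
    if not (raw.startswith(';') and raw.endswith('?')):
        raise ValueError("missing start/end sentinel")
    body = raw[1:-1]                       # strip the sentinels
    pan, _, rest = body.partition('=')     # account number | discretionary data
    expiry, service = rest[:4], rest[4:7]  # YYMM expiry, 3-digit service code
    return {"pan": pan, "expiry": expiry, "service_code": service}
```

Feeding it a synthetic swipe like `";1234567890123456=2512101?"` yields the account number `1234567890123456`, an expiry of `2512` (December 2025), and service code `101`.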
The magnetic pattern of a card's stripe made visible.
Magnetic stripes are produced in two levels of coercivity, high and low. While low-coercivity magstripes are less expensive to produce, they have a much shorter lifespan and can be erased more easily, even by contact with other magnetic items. They are typically reserved for gift cards, season passes, and other short-term uses. Longer-term cards, such as credit and debit cards, require high-coercivity magstripes, which take a stronger magnetic field to encode and are therefore harder to erase. Interestingly, high-coercivity stripes appear black on cards, while low-coercivity stripes tend to be light brown.
A key card being read by a reader.
Recently, the common magnetic stripe key card appears to be getting phased out. Many tech-savvy companies and consumers are turning to electronic alternatives such as Apple Pay, which does essentially the same thing, reading a user's encoded information to decide what to do, but with less hardware. Increasingly capable identity thieves and hackers have also made magstripes riskier: readers can be covertly fitted with skimmers that forward the card data to recipients beyond the company receiving the payment. The trend therefore favors chip readers, which are more difficult to breach. Regardless, the magnetic key card has been a huge part of our lives as the world has become more technological.
https://en.wikipedia.org/wiki/Magnetic_stripe_card#/media/File:Aufnahme_der_magnetischen_Struktur_eines_Magnetstreifens_auf_eine_EC-Karte_(Aufnahme_mit_CMOS-MagView)2.jpg
http://www.moosekeycard.com/price.html
Thursday, September 8, 2016
3D Touch on the iPhone 7
Tech giant Apple recently released its latest iteration of the iPhone, its wildly popular smartphone brand. As expected, the company tweaked many parts of the phone's design, from its headphone jack to the interior processing units to the camera, and, as I will focus on in this journal entry, its home button.
Apple's 3D Touch being used on the iPhone 6s.
Apple has used a technology it calls "3D Touch" on the phone's touchscreen in the past. It allows the user to press harder on the screen to trigger additional actions beyond a simple tap: different amounts of pressure can open previews, menu shortcuts, and other quick actions. On the iPhone 7, however, Apple is applying this technology to its home "button." While this has previously been a physical button that can be engaged for a number of purposes, the newest edition is a button in name only. Instead of moving mechanically, the new button registers the user's pressure and answers not with a tangible movement but with an internal haptic response. Major benefits of this advancement appear to be increased durability and compatibility with Apple's push to make the iPhone water resistant.
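The idea of pressure levels gating different actions can be sketched abstractly. The Python below is purely illustrative: the tier names echo Apple's "peek" and "pop" gestures, but the thresholds are invented for this example, not Apple's actual values or API.

```python
def classify_press(force, max_force):
    """Map a touch force (relative to the sensor's maximum) to a gesture
    tier, mimicking how pressure levels might gate tap / peek / pop.
    Thresholds are illustrative, not Apple's real values."""
    level = force / max_force
    if level < 0.3:
        return "tap"    # ordinary selection
    if level < 0.7:
        return "peek"   # preview the content
    return "pop"        # open the content or a shortcut menu
```

A light touch of 10% of maximum force reads as a tap, a firmer 50% press as a peek, and a 90% press as a pop.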
The iPhone 7 shown from the bottom.
Perhaps most interesting, The Verge reports that "third-party companies will also be able to program their own feedback through a taptic engine API." This means the myriad apps downloaded on iPhones will gain another input option. From a programming standpoint, simply increasing the variety of inputs gives app developers and Apple's own engineers many exciting new directions to explore. Overall, replacing a mechanical button with a 3D Touch button shows Apple's development teams overcoming obstacles introduced by the failure-prone mechanical home button. As it has with its product lines in the past, the tech giant is attempting to quell an issue while progressing its technology toward the future.
Sources:
http://appleapple.top/iphone-7-touchscreen-button-force-touch-id-will-accurately-simulate-the-usual-clicks/
http://www.slashgear.com/iphone-7-home-button-how-does-it-work-07455109/
http://appleinsider.com/articles/15/09/10/force-touch-gets-redefined-in-the-iphone-6s-with-3d-touch
http://www.theverge.com/circuitbreaker/2016/9/7/12828652/apple-iphone-7-home-button-removed-force-touch
http://www.businessinsider.com/apple-3d-touch-for-iphone-2015-9
Thursday, September 1, 2016
CGI in Filming
I was inspired to write about computer-generated imagery, or CGI, after watching the following video (definitely worth watching if you are a fan) on the Game of Thrones episode "Battle of the Bastards." It focuses mainly on how the producers augmented scenes to drastically increase the scope of a shot from a few dozen horses and actors to full-scale armies.
https://vimeo.com/172374044
CGI is usually used to create 3D images, although it can also be applied in 2D formats. I was surprised to learn that even the backgrounds of many CGI-heavy scenes, which I assumed were authentic places, were actually created with algorithms. Using these techniques, programmers can turn a blank canvas into realistic topography, achieving this authenticity with midpoint displacement formulas and by meshing surfaces together.
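The midpoint idea is easy to demonstrate. The following Python sketch implements one-dimensional midpoint displacement, a standard fractal-terrain technique and presumably similar in spirit to what is described above: each pass perturbs segment midpoints by a shrinking random offset, producing a jagged, mountain-like height profile.

```python
import random

def midpoint_displacement(left, right, depth, roughness=0.5, rng=None):
    """1-D midpoint displacement: repeatedly raise or lower each segment's
    midpoint by a shrinking random offset to build a terrain profile."""
    rng = rng or random.Random(0)
    heights = [left, right]
    spread = 1.0
    for _ in range(depth):
        nxt = []
        for a, b in zip(heights, heights[1:]):
            mid = (a + b) / 2 + rng.uniform(-spread, spread)
            nxt += [a, mid]
        nxt.append(heights[-1])
        heights = nxt
        spread *= roughness   # later passes perturb less: finer detail
    return heights
```

Starting from a flat line between two anchor points, four passes yield 17 samples whose endpoints stay fixed while the interior becomes convincingly rugged; a 2D analogue (the diamond-square algorithm) does the same over a grid.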
An example of CGI damp fur.
There has also been a significant amount of effort put into creating realistic images of skin, cloth, and fur. Programmers have struggled to imitate how these materials naturally respond to movement, but their portrayal can now be accurate down to 0.1 millimeters. Another application of CGI is in interactive settings such as flight simulators, where the program must render the user's actions accurately in the visualization.
One of the main drawbacks of CGI is its cost, in both time and money. Not only is it incredibly laborious, but according to Money Inc., ten minutes of CGI in a Game of Thrones episode equates to roughly $800,000. As CGI's applications and effectiveness continue to impress, producers may decide they are willing to pay more, which would be welcome news to many fans.
http://bgr.com/2016/06/29/game-of-thrones-battle-bastards-effects/
https://en.wikipedia.org/wiki/Computer-generated_imagery
http://moneyinc.com/much-costs-make-single-episode-game-thrones/
https://en.wikipedia.org/wiki/Fur
iRobot
iRobot's headquarters in Bedford, MA.
Founded by three MIT graduates in 1990, iRobot has become one of the leaders in robotic technology today. The company has sold over 14 million home robots, and more than 5,000 of its robots serve in defense roles with militaries and police forces. iRobot's PackBot, for example, has assisted in recovery efforts ranging from 9/11 to the Fukushima nuclear disaster. Some robots, such as the Seaglider, can operate underwater, drastically expanding their range of applications. Many of iRobot's earlier creations operated much like a common remote control car, but more recent home robots such as the Roomba operate autonomously.
The Roomba robot avoiding a staircase.
The Roomba, a robotic vacuum cleaner, is iRobot's most popular invention, with over 10 million units sold worldwide. Using only two wheels, the Roomba navigates around obstacles and even drop-offs and detects dirty spots on the floor. When the bumper on the front of the Roomba detects that it has run into something (an input), it internally issues a command to change direction (an output), allowing the robot to run independent of human intervention. Unlike some other autonomous robotic vacuums, the Roomba does not map out the rooms it cleans; instead, it traces walls and drives at random angles until it encounters an obstacle. Newer and more expensive Roombas also incorporate infrared sensors, giving them another tool for detecting obstacles as well as a way to find their charging base.
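That input-to-output loop can be modeled in a few lines. This Python toy is my own simplification rather than iRobot's actual algorithm: it drives a robot across a grid until its "bumper" detects a wall or edge, then turns it in a random direction.

```python
import random

def roomba_run(grid, start, steps, rng=None):
    """Toy bump-and-turn coverage loop on a grid of open (0) / wall (1)
    cells. Input: bumper hit (next cell blocked). Output: new heading."""
    rng = rng or random.Random(1)
    headings = [(0, 1), (1, 0), (0, -1), (-1, 0)]  # E, S, W, N
    r, c = start
    h = headings[0]
    cleaned = {(r, c)}
    for _ in range(steps):
        nr, nc = r + h[0], c + h[1]
        if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] == 0:
            r, c = nr, nc                 # path clear: keep driving
            cleaned.add((r, c))
        else:
            h = rng.choice(headings)      # bumper input: pick a new heading
    return cleaned
```

On an open 4x4 grid, a few hundred steps of this blind strategy end up visiting most cells, which is essentially why the real Roomba's random-angle approach works without a map.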
Although iRobot's original inventions were defense-centric, the company recently sold its defense and security business to Arlington Capital Partners so that it can focus on consumer robots, so there should be many exciting developments to come.
https://en.wikipedia.org/wiki/Roomba
https://en.wikipedia.org/wiki/IRobot
http://www.irobot.com/About-iRobot/Company-Information/History.aspx
http://roboticsandautomationnews.com/2015/07/23/irobot-second-quarter-financial-results-exceed-expectations/921/
http://www.irobot.com/For-the-Home/Vacuuming/Roomba.aspx