Just this week, Amazon received a patent for the smallest drone ever. The miniaturized unmanned aerial vehicle (UAV) is designed to assist users in a number of ways, from recovering lost persons and items to aiding police officers and firefighters. It's Amazon's first venture into a consumer drone product, not just a delivery one. The drone's features make it an all-around personal assistant, equipped with voice control and Alexa, Amazon's AI-powered voice assistant. It can respond to voice commands such as "follow me" and "hover," allowing for varied uses. Amazon also plans to make the drone quite small: small enough to fit in your pocket or to dock on a police officer's radio.

Using RFID search capabilities and facial recognition software, the drone could help locate lost persons or even your elusive car keys. On locating people, the patent states: The UAV can receive a "find Timmy" command, which can include the "search" routine, and possibly an "identify Timmy" subroutine to locate a person identified as "Timmy." In some examples, Timmy can have an RFID tag sewn into his clothes or a bar code printed on his shirt to facilitate identification.

AN UNMANNED FUTURE

Today's advances in unmanned vehicle technology are unprecedented. We appear headed for a future of unmanned everything (or at least many things), from aerial drones to driverless cars, robotic factory workers, and even submarines and boats. Amazon has taken a major step in that direction as the company moves beyond using drones for delivery. While many of the intended uses of this mini drone are still illegal in the United States, the FAA is reportedly updating its rules for commercial drone use within the next five years. The future is bright for unmanned vehicle technology. Rather than fear such advances, we should look to the value they add to the way we work and live.
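The patent's "find Timmy" example boils down to a small command-dispatch routine: match the voice command, then run a search subroutine against a registered tag. Here is a minimal, purely illustrative sketch in Python; the command names come from the patent language quoted above, while the tag registry, tag IDs, and function names are invented for illustration and do not reflect Amazon's actual implementation.

```python
# Hypothetical registry mapping search targets to RFID tag IDs
# (all values invented for illustration).
KNOWN_TAGS = {"Timmy": "RFID-0042", "keys": "RFID-0099"}

def handle_command(command, detected_tags):
    """Dispatch a voice command; detected_tags is the set of RFID
    tag IDs the drone's reader currently sees. Returns a status string."""
    if command in ("follow me", "hover"):
        return f"executing: {command}"
    if command.startswith("find "):
        # "find Timmy" -> run the search routine for Timmy's tag.
        target = command[len("find "):]
        tag = KNOWN_TAGS.get(target)
        if tag is None:
            return f"unknown target: {target}"
        return ("found " + target) if tag in detected_tags else ("searching for " + target)
    return "unrecognized command"

print(handle_command("find Timmy", {"RFID-0042"}))  # found Timmy
```

A real drone would of course loop the search step while flying a pattern; the sketch only shows how a single voice command could map to the patent's "search" and "identify" subroutines.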
It looks cute, but this Russian-made robot was recently "arrested" after making the rounds at a political rally, recording voters' opinions about a candidate's team. It sounds fairly harmless, but Promobot seems to have made enough trouble that local authorities asked police to apprehend and detain the robot. "Police asked to remove the robot away from the crowded area, and even tried to handcuff him," the Promobot spokesperson said. This isn't the first time Promobot has gotten into mischief: it has run away from its home laboratory before, twice. One dash for freedom ended with a battery-drained robot blocking thickening traffic in the street and its programmers left scratching their heads. Skeptics, however, dismissed the curious event as a hoax and a publicity stunt.

AI ACCOUNTABILITY?

So, what happens when artificially intelligent machines break the law? The scenario might seem straight out of science fiction, but it's a concern that is quickly becoming real as the technology develops. Back in February, US regulators legally designated Google's software as "the driver" in its autonomous cars. That raises the question: who is accountable for collisions caused by the software? Or, in cases where there is a passenger in the autonomous car and a third party is injured or killed, who gets sued? As the Washington Post points out, these questions boil down to liability, not necessarily insurance. John Townsend, a spokesman for AAA Mid-Atlantic, told the paper, "If you have a catastrophic failure of a product, you can sue the bejeezus out of a company, if the product causes the crash." While those cases may be relatively straightforward, the difficulty compounds in vehicles where humans do have the option of manually taking over. Beyond driverless cars, other applications of AI raise different questions. For example, last year two artists created a bot and enabled it to surf the [...]
NASA Just Found a Gamma Ray Star System: An international team of scientists has found the brightest gamma-ray binary ever seen, and the first observed outside the Milky Way. The team combined data from NASA's Fermi Gamma-ray Space Telescope with observations from other facilities and confirmed that what was once thought to be just a high-mass X-ray binary (HMXB) was in fact a gamma-ray binary system. Their findings have been published in The Astrophysical Journal.

The newly found gamma-ray binary, named LMC P3, was discovered in the Large Magellanic Cloud (LMC), a small nearby galaxy located 163,000 light-years away. Gamma-ray binaries are systems of two stars, one orbiting the other: typically a massive star paired with either a black hole or a neutron star (the dense, highly magnetized remnant of a collapsed massive star). They are very rare, with only five found in our galaxy to date. So far, LMC P3 is the most luminous gamma-ray binary ever found in gamma rays, X-rays, radio waves, and visible light.

"Fermi has detected only five of these systems in our own galaxy, so finding one so luminous and distant is quite exciting," says lead researcher Robin Corbet of NASA's Goddard Space Flight Center. "Gamma-ray binaries are prized because the gamma-ray output changes significantly during each orbit and sometimes over longer time scales. This variation lets us study many of the emission processes common to other gamma-ray sources in unique detail."

COSMIC DEATH RAYS

Having two extremely high-energy bodies in one system unleashes immense energy. On an ordinary day, Earth's atmosphere shields us from gamma rays arriving from outer space. A gamma-ray burst, however, could wipe out life on an entire planet that happens to lie in its beam. Some researchers postulate that such an event did just that to Earth 450 million [...]
A new robot, PepperPay, just debuted at the TechCrunch Disrupt SF Hackathon. Waiting in long lines at the store or fighting with the self-checkout may soon be a thing of the past. Robots and most technologies are built to make our lives easier by removing small inconveniences like long lines or the hunt for a specific book. It's all about instant gratification, right? Despite the convenience, the development of this kind of technology has stoked fears that robots will take jobs from human workers, and to that end, the announcement isn't necessarily great news for retail workers.

The PepperPay robot, built on the Pepper companion robot and powered by artificial intelligence (AI) and machine learning, can identify items from just a picture or snapshot. That lets customers breeze through checkout without the fuss often associated with barcode scanners at self-checkout counters. Developers Dave Idell, Adam Chew, and Nisha Garigarn were inspired to create PepperPay after waiting 30 minutes in line to buy toothpaste at Walgreens. They used IBM Watson's image recognition technology and handled the transactions through PayPal. While the current robot is integrated with the Pepper bot, the system could be adapted to run on a simple iPad, removing the need to buy an actual robot or specialized hardware.

Asked what's next for PepperPay, the developers had this to say: "We don't know too much about the retail game, but we imagine a future where something like this changes how we shop. We invite any and all to take a look at our code on github and see where they can take it!"
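The checkout flow described above (recognize the item from an image, look up its price, then hand the total to a payment step) can be sketched in a few lines. This is a simplified illustration, not the actual PepperPay code: the price list is invented, and the recognition and charging functions are stand-ins for the real calls to Watson image recognition and PayPal.

```python
# Invented price list for illustration.
PRICE_LIST = {"toothpaste": 3.49, "notebook": 2.25}

def recognize_item(image_labels):
    """Stand-in for an image-recognition call, which would return a
    ranked list of labels for the photographed item. Here we just
    take the first label that matches something in the store."""
    for label in image_labels:
        if label in PRICE_LIST:
            return label
    return None

def checkout(image_labels):
    """Camera-based checkout: recognize, price, then charge."""
    item = recognize_item(image_labels)
    if item is None:
        return "item not recognized"
    total = PRICE_LIST[item]
    # A real system would call the payment provider's API here
    # instead of just reporting the amount.
    return f"charged ${total:.2f} for {item}"

print(checkout(["tube", "toothpaste"]))  # charged $3.49 for toothpaste
```

The point of the sketch is the shape of the pipeline: because recognition and payment are separate steps behind plain function boundaries, the same flow could run on a Pepper robot, an iPad, or any device with a camera, which is exactly the portability the developers describe.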