Category: Science

Google claims its TPU improves machine learning

Google claims that its Tensor Processing Unit (TPU) advances machine learning capability by the equivalent of three generations of Moore's Law.

Google CEO Sundar Pichai told Google’s I/O developer conference that TPUs deliver an order of magnitude higher performance per watt than any commercially available GPU or FPGA.

Pichai said the chips powered the AlphaGo computer that beat Lee Sedol, the world champion of the incredibly complicated game Go. Google is still not going into detail about the Tensor Processing Unit, but the company did disclose a little more information on its blog.

“We’ve been running TPUs inside our data centres for more than a year, and have found them to deliver an order of magnitude better-optimised performance per watt for machine learning. This is roughly equivalent to fast-forwarding technology about seven years into the future (three generations of Moore’s Law),” the blog said. “TPU is tailored to machine learning applications, allowing the chip to be more tolerant of reduced computational precision, which means it requires fewer transistors per operation. Because of this, we can squeeze more operations per second into the silicon, use more sophisticated and powerful machine learning models, and apply these models more quickly, so users get more intelligent results more rapidly.”
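
For the non-chip-heads, the reduced-precision trick largely boils down to swapping 32-bit floats for 8-bit integers. A rough NumPy sketch of the trade-off (the matrix and scale factor here are made up for illustration, not anything Google has disclosed):

```python
import numpy as np

# Hypothetical weights from a trained model, stored as 32-bit floats.
weights_fp32 = np.random.randn(1024, 1024).astype(np.float32)

# Simple symmetric quantisation to 8-bit integers: pick a scale so the
# largest weight maps to 127, then round everything onto the int8 grid.
scale = np.abs(weights_fp32).max() / 127.0
weights_int8 = np.round(weights_fp32 / scale).astype(np.int8)

# The int8 copy needs a quarter of the memory...
print(weights_fp32.nbytes, "bytes vs", weights_int8.nbytes, "bytes")

# ...and dequantising shows the rounding error is small for inference purposes.
error = np.abs(weights_fp32 - weights_int8.astype(np.float32) * scale).max()
print("worst-case rounding error:", error)
```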

The tiny TPU can fit into a hard drive slot within the data centre rack and has already been powering RankBrain and Street View, the blog said.

What Google is not saying is what a TPU actually is and whether it will be a replacement for a CPU or a GPU. Word on the street is that the TPU could be a chip that runs machine learning algorithms which have first been crafted on more power-hungry GPUs and CPUs.

Secret meeting mulls creating plastic humans

More than a hundred scientists, lawyers, and entrepreneurs gathered in secret to discuss the radical possibility of creating a synthetic human genome.

According to the New York Times, attendees were told to keep a tight lip about what took place, but someone must have dropped a hint to the press. Synthesising a human genome is a big step up from gene editing – it means using chemicals to manufacture all the DNA contained in human chromosomes. Because the base-pair sequences would be custom-designed, geneticists would not be bound by the two base pairs produced by nature.

They could, in theory, build microbes, animals and humans. So a company could build the right human for the job.

Obviously this is an ethical minefield, and the world of science appears not to have got the hang of getting the public on its side. It seems to think that if there is a public debate, then religious nutjobs will lean on politicians who will put the lid on the whole thing. However, keeping the meeting secret has created an internet conspiracy stir, and reports of the meeting appear to be getting out of hand.

George Church, a professor of genetics at Harvard Medical School and a key organizer of the proposed project, said that the meeting wasn’t really about synthetic human genomes; rather, it was about efforts to improve the ability to synthesize long strands of DNA, which geneticists could use to create all manner of animals, plants and microbes.

Yet the original name of the project was “HGP2: The Human Genome Synthesis Project”. What’s more, an invitation to the meeting clearly stated that the primary goal would be “to synthesise a complete human genome in a cell line within a period of ten years”.

Church said the meeting was secret because his team has submitted a paper to a scientific journal, and they’re not supposed to discuss the idea publicly before publication.

Church does want to build a complete human genome in a cell line within ten years. So far scientists have synthesized a simple bacterial cell.

Watson gets a job as a lawyer

Biggish Blue’s AI supercomputer Watson has just got a job as a bankruptcy lawyer.

Global law firm Baker & Hostetler has bought itself Ross, the first artificially intelligent attorney, built by ROSS Intelligence. Ross will be employed in the law firm’s bankruptcy practice, which currently employs more than 50 lawyers.

Ross can understand your questions, and respond with a hypothesis backed by references and citations. It improves on legal research by providing you with only the most highly relevant answers rather than thousands of results you would need to sift through.

It constantly monitors current litigation so that it can notify you about recent court decisions that may affect your case, and it will continue to learn from experience, gaining more knowledge and operating more quickly, the more you interact with it.
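
ROSS Intelligence has not said how Ross works under the hood, but the monitor-and-alert workflow described above might look something like this toy sketch, with every case name, citation and decision invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Matter:
    # A hypothetical open case and the topics it turns on.
    name: str
    keywords: set = field(default_factory=set)

@dataclass
class Decision:
    # A hypothetical newly published court decision.
    citation: str
    summary: str

def decisions_affecting(matter, new_decisions):
    """Crude keyword overlap, standing in for whatever relevance model Ross actually uses."""
    return [d for d in new_decisions
            if any(k in d.summary.lower() for k in matter.keywords)]

# Example run with invented data.
matter = Matter("In re Example Corp", {"chapter 11", "safe harbour"})
feed = [Decision("123 F.3d 456", "Court narrows the safe harbour under chapter 11 ..."),
        Decision("789 F.3d 012", "Trademark dispute over ...")]
for d in decisions_affecting(matter, feed):
    print("Alert:", d.citation)
```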

Andrew Arruda, ROSS Intelligence co-founder and CEO, said other law firms have also signed licences for Ross, and more announcements are expected.

It is nice that lawyers will be the first race of sharks to be wiped out by our robotic overlords. If we could replace politicians next, that would be even better.

 

Smartphones give us ADHD symptoms

Smartphone use creates symptoms similar to Attention Deficit Hyperactivity Disorder (ADHD), a new study has suggested.

Kostadin Kushlev, a research associate in psychology at the University of Virginia, recruited 221 students at the University of British Columbia for a two-week study of the effects of smartphones on them.

During the first week, he asked half the participants to minimise phone interruptions by activating the “do-not-disturb” settings and keeping their phones out of sight and far from reach. The other half were instructed to keep their phone alerts on and their phones nearby whenever possible. In the second week the conditions were swapped, so participants who had used their phones’ “do-not-disturb” settings switched their alerts on, and vice versa. The order in which each participant received the instructions was determined by the flip of a coin.

Kushlev then measured inattentiveness and hyperactivity by asking participants to report how frequently they had experienced 18 symptoms of ADHD over each of the two weeks. These items were based on the criteria for diagnosing ADHD in adults specified in the American Psychiatric Association’s Diagnostic and Statistical Manual (DSM-V). The result was that more frequent phone interruptions made people less attentive and more hyperactive.
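
For the statistically curious, the counterbalanced crossover design described above can be sketched in a few lines of Python; the symptom scores below are invented, not Kushlev’s data:

```python
import random

def run_participant(pid):
    # A coin flip decides whether week 1 is the "do-not-disturb" week
    # or the "alerts on, phone nearby" week; week 2 is the other condition.
    order = ["quiet", "interrupted"] if random.random() < 0.5 else ["interrupted", "quiet"]
    scores = {}
    for condition in order:
        # Invented scores: 18 ADHD-style items rated 0-4, with interruptions
        # assumed to nudge ratings upward (that is the hypothesis, not real data).
        bump = 0.4 if condition == "interrupted" else 0.0
        scores[condition] = sum(min(4, max(0, random.gauss(1.5 + bump, 1))) for _ in range(18))
    return scores

results = [run_participant(i) for i in range(221)]
diff = sum(r["interrupted"] - r["quiet"] for r in results) / len(results)
print("mean within-person difference (interrupted - quiet):", round(diff, 2))
```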

ADHD is a neurodevelopmental disorder, and Kushlev is not saying that smartphones can cause ADHD, but the findings suggest that people can end up behaving as if they have it. He thinks that smartphones could be harming the productivity, relationships and well-being of millions.

“Our findings suggest that our incessant digital stimulation is contributing to an increasingly problematic deficit of attention in modern society. So consider silencing your phone – even when you are not in the movie theater. Your brain will thank you,” Kushlev wrote.

Top musicians look for medical cures

Musicians Peter Gabriel, St. Vincent, Jon Hopkins and Esa-Pekka Salonen are helping an initiative led by former Nokia design head Marko Ahtisaari to explore the future of musical medicine.

The four musicians will join The Sync Project as advisors, roles that will involve working with the scientists researching music’s therapeutic properties and helping to raise awareness of the project.

Gabriel and St. Vincent are art-rock veterans, Hopkins is an accomplished electronic producer, and Salonen conducts London’s Philharmonia Orchestra. However, Ahtisaari is more interested in their value as thinkers than in their musical legacies.

He said that he needed musicians and creators who have an active relationship with technology. It was not so much about the content of their music, or about commissioning any work; it was because they are creative thinkers.

The idea is to build a biometric recommendation engine for music and create musical treatment programs for medical conditions that match the efficacy of drug-based treatment without subjecting patients to the dangers and side effects of pharmacological programs.

Ahtisaari cites treatment for Parkinson’s disease as an example. Users could contribute data from the streaming service of their choice, along with data from sensors on their phones or wearable devices that characterize their physical response to certain music.

Collected in bulk, that data could inform more specific clinical trials testing the effects of various musical qualities on patient mobility.

The final result would be a personalized playlist, one that aids movement and changes with the patient’s activity.
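
The Sync Project has not published how such a recommendation engine would work, but the basic idea of ranking tracks by a patient’s measured response might look like this hypothetical sketch:

```python
# Hypothetical data: for each track, sensor-derived mobility scores recorded
# while a patient listened (the Sync Project has not published its data model).
track_responses = {
    "track_a": [0.62, 0.70, 0.66],
    "track_b": [0.41, 0.38, 0.45],
    "track_c": [0.55, 0.58, 0.61],
}

def personalised_playlist(responses, length=2):
    """Rank tracks by the patient's mean measured response and keep the best ones."""
    ranked = sorted(responses,
                    key=lambda t: sum(responses[t]) / len(responses[t]),
                    reverse=True)
    return ranked[:length]

print(personalised_playlist(track_responses))
```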

The project’s musical advisors can’t shape its medical aspects, but Ahtisaari is hoping they can help push the conversation regarding music’s therapeutic potential forward among both musicians and listeners.

 

Robot has humans in stitches

Scientists have created a robot that stitches up living animals without a real doctor telling it what to do.

The big idea is to move toward autonomous surgical robots, removing the surgeon’s hands from certain tasks that a machine might perform all by itself.

In small tests using pigs, the robot performed at least as well as, and in some cases a bit better than, competing surgeons in stitching together intestinal tissue.

Dr. Peter C.W. Kim of Children’s National Health System in Washington told the Science Translational Medicine journal that the big idea was not to replace surgeons.

“If you have an intelligent tool that works with a surgeon, can it improve the outcome? That’s what we have done.”

Robot-assisted surgery has been controversial, as some studies have shown it can bring higher costs without better outcomes.

The robot is designed to do one specific task: stitching up tissue. Dubbed the Smart Tissue Autonomous Robot (STAR), it is a bit like a programmable sewing machine.

Kim’s team at Children’s Sheikh Zayed Institute for Pediatric Surgical Innovation took a standard robotic arm and equipped it with suturing equipment plus smart imaging technologies to let it track moving tissue in 3-D and with an equivalent of night vision. They added sensors to help guide each stitch and tell how tightly to pull.

The surgeon places fluorescent markers on the tissue that needs stitching, and the robot takes aim as doctors keep watch.

STAR could reconnect tubular pieces of intestinal tissue from pigs. Soft-tissue surgery is tricky for machinery because the tissue moves out of place so easily, and the stitches in these connections must be placed precisely to avoid leaks or blockages.
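
The team has not released STAR’s control code, but the even-spacing idea behind its suturing can be sketched roughly: given tracked marker positions along a seam, place stitch targets at regular intervals (all positions below are invented):

```python
import numpy as np

def plan_stitches(marker_positions, spacing_mm=3.0):
    """Given tracked 3-D marker positions along a seam, place stitch targets
    at roughly even intervals along the curve they trace out."""
    markers = np.asarray(marker_positions, dtype=float)
    # Cumulative distance along the seam defined by the markers.
    seg = np.linalg.norm(np.diff(markers, axis=0), axis=1)
    dist = np.concatenate([[0.0], np.cumsum(seg)])
    targets = np.arange(0.0, dist[-1], spacing_mm)
    # Interpolate each coordinate to get evenly spaced stitch points.
    return np.stack([np.interp(targets, dist, markers[:, i]) for i in range(3)], axis=1)

# Invented marker positions (millimetres) along a curved piece of tissue.
seam = [(0, 0, 0), (5, 1, 0.5), (10, 1.5, 1.0), (16, 1.0, 1.2)]
print(plan_stitches(seam))
```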

Using pieces of pig bowel outside of the animals’ bodies as well as in five living but sedated pigs, the researchers tested the STAR robot against open surgery, minimally invasive surgery and robot-assisted surgery.

None of the pigs ended up as someone’s dinner and the machine did a pretty good job. In a couple of cases STAR had to reposition fewer stitches than the surgeons performing minimally invasive or robot-assisted suturing. But in the living animals, the robot took much longer and made a few suturing mistakes while the surgeon sewing by hand made none.

Kim, whose team has filed patents on the system, said the robot can be sped up. He hopes to begin human studies in two or three years.

Self-driving cars spark sex fears

Canadian government officials have finally revealed that one of the reasons they don’t like the concept of self-driving cars is that people will have sex in them.

Of course people have sex in cars now, sometimes while they are moving, but Canadian federal bureaucrats are worried that semi-autonomous cars that don’t require much input from the driver will result in that input going elsewhere.

Barrie Kirk of the Canadian Automated Vehicles Centre of Excellence has said that the smarter cars get the more bonking will take place.

“I am predicting that, once computers are doing the driving, there will be a lot more sex in cars.”

“That’s one of several things people will do which will inhibit their ability to respond quickly when the computer says to the human, ‘Take over.’”

Federal officials, who have been tasked with building a regulatory framework to govern driverless cars, highlighted their concerns in briefing notes compiled for the Canadian Transport Minister Marc Garneau.

The report said that the issue of the attentive driver is … problematic.

“Drivers tend to overestimate the performance of automation and will naturally turn their focus away from the road when they turn on their auto-pilot,” said the report.

 

Inject Google into your eyeball

Search engine Google appears to believe there are people out there who are willing to have things injected into their eyeballs.

Google has filed a patent for a vision-correcting electronic device. This sounds pretty good until you discover it has to be injected directly into your eye.

Google said that the device is designed to help focus light onto the retina, correcting poor vision. It will contain its own storage, radio and lens, and will apparently be powered wirelessly by an energy-harvesting antenna. Good vision is better than a poke in the eye with a short stick, after all.

Knowing Google, it will probably force you to see advertising, or some other atrocity, but its biggest problem is that it will have to be injected. This is probably one of the worst nightmares anyone can have. It is all fun and games until someone has their eye out.

Although Google has filed the patent, that’s no guarantee that we’ll see the idea come to life anytime soon, or even at all.

 

Siemens’s spiders make plastic products, not webs

Siemens thinks that future projects will involve not just engineers, designers and workmen, but an army of small robot spiders 3D printing and weaving plastic together.

In a lab in Princeton, New Jersey, the company’s researchers are testing spider-like robots that extrude not silk but plastic, thanks to portable 3-D printers. The robots can work together autonomously to create simple objects.

The robots use onboard cameras as well as a laser scanner to interpret their immediate environment. Each robot autonomously works out which part of an area it can cover, while other robots use the same technique to cover adjacent areas.

The project leader Hasan Sinan Bank said that by dividing each area into vertical boxes, the robots can work collaboratively to cover even complex geometries in such a way that no box is missed.

“No one else has attempted to do this using mobile manufacturing,” he said.
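
Siemens has not published the algorithm, but the vertical-box idea might be sketched like this, with each robot claiming the unclaimed boxes it can reach (the grid size and reach sets here are invented):

```python
from itertools import product

def split_into_boxes(width, depth, box_size):
    """Divide a rectangular work area into a grid of vertical boxes (columns)."""
    return {(x, y) for x, y in product(range(0, width, box_size),
                                       range(0, depth, box_size))}

def claim_boxes(robot_id, unclaimed, reachable, assignments):
    """Each robot claims the reachable boxes nobody has taken yet."""
    mine = unclaimed & reachable
    unclaimed -= mine            # remove claimed boxes from the shared pool
    assignments[robot_id] = mine
    return assignments

boxes = split_into_boxes(width=40, depth=20, box_size=10)
assignments = {}
claim_boxes("spider_1", boxes, {(0, 0), (0, 10), (10, 0)}, assignments)
claim_boxes("spider_2", boxes, {(10, 0), (20, 0), (30, 10)}, assignments)
print(assignments)                      # which spider prints which columns
print("boxes still unassigned:", boxes)  # anything left here needs another robot
```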

The robots work in shifts: after two hours of work, a tired spider will transmit its data to a replacement, then walk back and recharge itself.

The technology is all new but could be earmarked for large projects like shipbuilding or construction work.

The robots are partially automated, but will eventually become more fully autonomous, learning how to interact with their environment.

Of course it is not a real spider (it only has six legs), but it might grow a pair as the project develops.

 

Nvidia teaches a car called Dave to drive itself

An Nvidia engineering team has built a self-driving car with one camera, one Drive-PX embedded computer and only 72 hours of training data.

Nvidia published an academic preprint of the results of the DAVE2 project entitled End to End Learning for Self-Driving Cars.

DAVE2 is named after a 10-year-old Defense Advanced Research Projects Agency (DARPA) project known as DARPA Autonomous Vehicle (DAVE). Coincidentally, it is also the name of the astronaut in 2001: A Space Odyssey. The phrase “I can’t do that, Dave” is now the “blue screen of death” of robotic history.

The Nvidia team trained a convolutional neural network (CNN) to map raw pixels from a single front-facing camera directly to steering commands. Nvidia’s breakthrough is that the vehicle taught itself, simply by watching how a human drove, the internal representations of the processing steps needed to see the road ahead and steer, without ever being explicitly trained to detect features such as roads and lanes.

Although in operation the system uses one camera and one Drive-PX embedded computer, the training rig used three cameras and two computers to acquire three-dimensional video and the steering angles of a human driver, which were then used to train the system to see and drive.
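
The paper’s point is that the whole pipeline is a single network trained on (camera frame, steering angle) pairs. This is not Nvidia’s published architecture, just a stripped-down PyTorch sketch of the end-to-end idea, with random tensors standing in for real driving data and a small fixed input resolution assumed:

```python
import torch
import torch.nn as nn

class SteeringNet(nn.Module):
    """Raw camera pixels in, one steering angle out; a miniature stand-in
    for the kind of CNN the DAVE2 paper describes."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 24, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 36, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(36, 48, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 8)),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(48 * 4 * 8, 100), nn.ReLU(),
            nn.Linear(100, 1),   # predicted steering angle
        )

    def forward(self, frames):
        return self.head(self.features(frames))

# One training step on a dummy batch: front-camera frames paired with the
# human driver's recorded steering angles (random tensors here, not real data).
model = SteeringNet()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-4)
frames = torch.randn(8, 3, 66, 200)   # batch of resized camera images (assumed size)
angles = torch.randn(8, 1)            # recorded steering angles
loss = nn.functional.mse_loss(model(frames), angles)
loss.backward()
optimiser.step()
print(float(loss))
```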