Tag: university of washington

Scientists plan test to see if the entire universe is a simulation created by futuristic supercomputers

US scientists are attempting to find out whether all of humanity is currently living a Matrix-style computer simulation being run on supercomputers of the future.

According to researchers at the University of Washington, there are tests that could be done to begin to work out whether we are in fact real, or merely a simulation created by a futuristic android on its lunch break.

Currently, computer simulations are decades away from creating even a primitive working model of the universe. In fact, scientists can accurately model only a region around a hundred-trillionth of a metre across, with a model of anything as large as a full human being still far out of reach.

By looking for underlying patterns, physicists believe it may be possible to work out whether we exist in a computer-generated universe built many years in the future. Looking at the constraints that limited resources would impose on any simulation could reveal signs that we are mere bit-part players in a Matrix-style film plot.

It will take many years to reach the computational power to give a real glimpse of whether we are living in a simulation, the scientists contend, but even by looking at the tiny portion of the universe that we can currently accurately model, it may be possible to detect ‘signatures’ of constraints on physical processes that could point to a simulation.  

The researchers suggest that such a signature could show up as a limit on the energy of cosmic rays, for example. Any simulation built on an underlying ‘lattice’ framework – the kind used in today’s models of fundamental physics – would impose a cutoff on particle energies, so testing the behaviour of the highest-energy cosmic rays could reveal patterns that point to a simulation.
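To get a feel for the kind of signature involved: a simulation built on a lattice can only represent particle energies up to a cutoff set by the lattice spacing, roughly E_max = πħc/a. The sketch below compares such a cutoff with the observed GZK limit on cosmic-ray energies; the lattice spacing used is purely hypothetical and none of the numbers come from the researchers’ paper.

```python
import math

HBAR_C_EV_FM = 197.327e6  # reduced Planck constant times c, in eV * femtometres

def lattice_cutoff_ev(spacing_fm):
    """Maximum particle energy a cubic lattice with the given spacing
    can represent: E_max = pi * hbar * c / a."""
    return math.pi * HBAR_C_EV_FM / spacing_fm

# Hypothetical lattice spacing of 1e-12 fm (vastly finer than the ~0.1 fm
# spacings of today's lattice-QCD simulations) gives a cutoff in the same
# ballpark as the observed GZK limit on cosmic-ray energies (~5e19 eV).
GZK_EV = 5e19
cutoff = lattice_cutoff_ev(1e-12)
print(f"cutoff = {cutoff:.2e} eV, GZK limit = {GZK_EV:.1e} eV")
```

A finer lattice pushes the cutoff higher, which is why only the very highest-energy cosmic rays could betray such a structure.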

“This is the first testable signature of such an idea,” one of the researchers, Martin Savage, said.

Aside from the rather mind-boggling proposition that we may be part of a computer simulation, another researcher pointed out that this would bring up the possibility of inter-universe computer platforms, and the potential to communicate across these.

“Then the question is, ‘Can you communicate with those other universes if they are running on the same platform?’” UW graduate student, Zohreh Davoudi, asked.

Bionic lenses send text straight to your eye

Scientists have taken another step towards creating the bionic man of sci-fi lore with a wearable display mounted in a contact lens.

A team from the University of Washington and Aalto University in Finland managed to squeeze all the necessary equipment into a wearable lens that could display information right before your eyes.

This consists of an antenna that harvests power beamed from an external source, an integrated circuit that stores the energy, and a transparent sapphire chip holding a single blue LED.

Granted, the researchers are only able to produce a display of one pixel at the moment, and testing so far has only been on rabbits, but the possibilities are there.

The next step for the team is to get actual text onto the contact lens. Once that works, the universities envisage applications such as reading text messages and emails.

A Doom-esque HUD would certainly be useful, with the researchers pointing to visual readouts of glucose levels as an example. Equally, your bank balance flickering up as you approach a pub could be handy, and a large hovering arrow ushering you home after six pints would be worth a shot.

It may be some time before futuristic developments like subtitles appearing in your field of vision or email alerts flashing up in your lenses are actually available from Specsavers, however.

The ‘proof of concept’ design may not have done any damage to the test bunny, but there are still plenty of hurdles in getting out of the early prototype stage.

The researchers did manage to solve the problem of how to make the information appear unblurred, considering that a human eye cannot focus on anything closer than about seven centimetres. This involved developing substantially thinner and flatter lenses than are typically available, allowing the image to be focused on the retina.

Improvements will need to be made to the antenna power system if the lenses are to be usable. For example, while it was possible to power the display from a metre away in free space, the range dropped to just two centimetres once the lens was placed on the rabbit’s eye.
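For a rough sense of why distance matters to a wirelessly powered link, the free-space behaviour can be sketched with the standard Friis transmission equation. The numbers below are hypothetical, and the real-world range drop on the eye is dominated by antenna detuning and tissue absorption rather than distance alone, which this sketch does not capture.

```python
import math

def friis_received_power_w(p_tx_w, gain_tx, gain_rx, wavelength_m, distance_m):
    """Free-space received power from the Friis transmission equation:
    P_rx = P_tx * G_tx * G_rx * (lambda / (4 * pi * d))^2."""
    return p_tx_w * gain_tx * gain_rx * (wavelength_m / (4 * math.pi * distance_m)) ** 2

# Hypothetical link: 1 W transmitter, unity-gain antennas, 2.4 GHz (0.125 m wavelength).
at_1m = friis_received_power_w(1.0, 1.0, 1.0, 0.125, 1.0)
at_2cm = friis_received_power_w(1.0, 1.0, 1.0, 0.125, 0.02)
print(f"at 1 m: {at_1m:.2e} W, at 2 cm: {at_2cm:.2e} W")
```

Halving the distance quadruples the received power, so a link that barely works at two centimetres needs a far stronger source, a better antenna, or much more efficient energy storage to work at arm’s length.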

Still, the researchers are quite clearly excited, plumping for the slightly unnerving “Terminator-style info-vision takes step towards reality” as the press release’s headline.

Scientists create photon-based bio transistor

Scientists have been making inroads into interfacing the human body with electronic systems for some time now.

But a team of researchers has taken a massive step towards true integration and communication between human bodies and electronic devices by creating an organic transistor.

The scientists at the University of Washington say they have constructed the otherworldly transistor which uses photons instead of electrons to interface with the world of the living.  Good news of course for Apple fans seeking an even more intimate relationship with their iPads.

While electronic devices use electrons to send information around their components, a similar job in human bodies is performed by ions or protons – positively charged hydrogen ions.

Communication between living matter and electronics has been problematic because translating an electronic signal into an ionic signal, or vice versa, has proved difficult. But a new biomaterial derived from crab shells and squid makes it potentially achievable, as it can conduct protons.

As protons are used in living cells to transmit signals, for example from the brain to muscles, it could eventually be possible to control such functions directly via a computer. And this has been made more likely by the development of a transistor based on the material, chitosan.

The team has been able to create a field effect transistor with the usual gate, drain and source terminals, but one that carries protons rather than electrons. The prototype measures around five microns wide.

The transistor is then able to switch a proton current on and off, just as a regular transistor in a microchip switches an electron current.
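As a toy illustration of that switching behaviour – not the team’s actual device physics – a field-effect transistor can be thought of as a gate-controlled valve: below a threshold gate voltage no current flows, above it the current grows with the gate overdrive. All names and values here are made up for illustration.

```python
def proton_fet_current(gate_v, drain_source_v, threshold_v=0.5, k=1e-9):
    """Toy linear-region FET model: zero current below the gate threshold,
    otherwise a current proportional to gate overdrive and drain-source bias.
    Illustrative numbers only, not measured device parameters."""
    overdrive = gate_v - threshold_v
    if overdrive <= 0:
        return 0.0  # channel off: the proton current is switched off
    return k * overdrive * drain_source_v  # channel on: protons conduct

print(proton_fet_current(0.0, 1.0))  # gate below threshold -> no current
print(proton_fet_current(1.0, 1.0))  # gate above threshold -> current flows
```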

The current version of the transistor will require a material other than a silicon base before it can be accepted by a human body, however.

Once some major problems are ironed out, and this could be some time, there are some interesting potential applications for communicating between the human body and electronic devices.

It is thought that it should be possible to monitor biological processes. Whether this will involve wirelessly contacting Domino’s when a chip senses your stomach is empty is unclear, but we can see the opportunity for lucrative research in that area.

Intel funds silicon photonics foundry service

Intel’s supporting a new University of Washington programme designed to dramatically cut the cost of manufacturing silicon photonic chips.

The Optoelectronics Systems Integration in Silicon (OpSIS) project will allow ‘shuttle runs’, in which researchers cut costs by sharing silicon wafers between multiple projects. A single circuit design might use only a few square millimetres, says assistant professor of electrical engineering Michael Hochberg, so that shuttle runs can cut costs by more than 100 times.
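The arithmetic behind that saving is straightforward: if a fabrication run is billed by the reticle area a design occupies, a few square millimetres of a shared wafer costs a small fraction of a dedicated run. The numbers below are hypothetical, chosen only to show how a factor of more than 100 falls out.

```python
def shuttle_run_share(run_cost, reticle_area_mm2, design_area_mm2):
    """Cost to one project when a shared run is billed by occupied area."""
    return run_cost * design_area_mm2 / reticle_area_mm2

# Hypothetical numbers: a $250,000 mask-and-fab run over a 25 mm x 25 mm
# reticle (625 mm^2), with one group's design occupying just 5 mm^2.
full_run = 250_000
share = shuttle_run_share(full_run, 625, 5)
print(f"share: ${share:,.0f}, savings factor: {full_run / share:.0f}x")
```

With these illustrative figures each group pays $2,000 rather than the full run cost, a 125-fold saving, consistent with the “more than 100 times” quoted above.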

“We would like the photonics industry, 10 years from now, to function in a way that’s very similar to the electronics industry today,” he says. “People building optoelectronic systems will send designs out to an inexpensive, reliable third party for manufacturing, so they can focus on being creative about the design.”

In developing the rules and protocols, the team aims to ensure that even non-specialists can design and build functioning chips that integrate photonics and electronics.

“You want a minimum of rules because people are going to use the technology in ways that you never imagined,” says Carver Mead, professor emeritus at the California Institute of Technology. “You want people to use it in ways that seem crazy.”

Silicon photonics provides a faster, lower-power means of moving data around than electrons can; a single optical fibre or waveguide can carry many terabits of data per second.

Companies including Intel and IBM have made major breakthroughs in the technology over the last year.

“OpSIS will enhance the education of US engineering students, giving them the opportunity to learn the new optical design paradigm,” says Intel’s chief technology officer, Justin Rattner.

“The ability to produce such low-cost silicon chips that manipulate photons, instead of electrons, will lead to new inventions and new industries beyond just data communications, including low-cost sensors, new biomedical devices and ultra-fast signal processors.”

OpSIS has already signed up a few early users, who are participating in so-called ‘risk runs’ to test the new protocols.

One such is John Bowers, a professor of electrical and computer engineering at the University of California, Santa Barbara, who has designed a circuit for the first run.

“By focusing research of many different groups in one process line, that allows you to advance a library of components and processes faster than any one group could do on its own,” Bowers said. “It enables a faster evolution of photonic devices.”

Eventually, the centre plans to offer three runs per year, each of which could accommodate 30 to 40 users. The chips will be built by BAE Systems.

Kinect used for robotic surgery

Researchers have adapted Microsoft’s Kinect technology for use with robotic tools in surgery, making remote treatment more straightforward and reducing costs.

The main function of the technology is to utilise the array of cameras and sensors – which allow videogame players to control their Xbox 360 without a controller – in order to provide surgeons with feedback when performing surgery.

According to Howard Chizeck, professor of electrical engineering at the University of Washington, robotic tools, which are commonly used for minimally invasive surgery, do not allow the surgeon any sense of touch when operating.

“What we’re doing is using that sense of touch to give information to the surgeon, like ‘You don’t want to go here,’” Chizeck says.

Remotely controlled surgical instruments are often mounted on the ends of tubes, which are then inserted into the patient in order to minimise scarring.

Surgeons are then able to control the instruments with input devices similar to complex joysticks, and use tiny cameras in the tubes to see inside the patient.

The problem for the surgeons is that they get no realistic feedback from the tools: when a surgical instrument is pushed into something solid it is forced to stop, while the joystick remains free to move.

To resolve this, graduate student Fredrik Ryden wrote code that allows the Kinect to map the environment in three dimensions and relay it back to the user.

With this information it is then possible to define restricted regions by stopping the joystick from moving into them. The joystick could, for example, guide a tool along the line of a bone, or block surgical tools from inadvertently hitting vital organs.

“We could define basically a force field around, say, a liver,” said Chizeck. “If the surgeon got too close, he would run into that force field and it would protect the object he didn’t want to cut.”
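That “force field” idea can be sketched as what the haptics literature calls a virtual fixture: if a proposed tool position would fall inside a protected sphere around an organ, it is projected back onto the sphere’s surface. This is an illustrative sketch under that assumption, not the team’s actual code.

```python
import math

def clamp_outside_sphere(pos, center, radius):
    """Virtual-fixture sketch: keep a proposed tool position out of a
    protected sphere by projecting interior points back to the surface."""
    dx, dy, dz = (p - c for p, c in zip(pos, center))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dist >= radius:
        return pos  # outside the forbidden zone: allow the motion as-is
    if dist == 0.0:
        # Degenerate case: proposed position is exactly at the centre;
        # push it out along an arbitrary axis.
        return (center[0] + radius, center[1], center[2])
    scale = radius / dist
    return tuple(c + d * scale for c, d in zip(center, (dx, dy, dz)))

# A move toward the centre of a protected unit sphere gets stopped at its surface.
print(clamp_outside_sphere((0.1, 0.0, 0.0), (0.0, 0.0, 0.0), 1.0))
```

In a real system the same test would also feed a resistive force back to the joystick, which is what gives the surgeon the sensation of bumping into Chizeck’s force field.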

“It’s really good for demonstration because it’s so low-cost, and because it’s really accessible,” Ryden said. “You already have drivers, and you can just go in there and grab the data. It’s really easy to do fast prototyping because Microsoft’s already built everything.”

Before the team hit on using a Kinect, a comparable system would have cost around $50,000, according to Chizeck; the Kinect version took just one weekend to put together.

Furthermore, the system will allow greater reliability for long-distance use, enabling doctors in major cities to perform surgery remotely in far-flung areas.

This could also lead to applications for use in disaster relief or in battlefield use.

“Suppose there’s an earthquake somewhere,” Chizeck said. “First responders could get victims to a van with a satellite dish on top and the tools inside, and a surgeon somewhere else could perform the surgery.”

With a paper due to be published soon, the researchers will now focus on scaling the sensors down to a size appropriate for surgical use; the video resolution will also need to be increased before the system is usable.

Computers to be built on contact lens

Scientists at the University of Washington are working on solar-powered contact lenses that come with their own computers on board.

Professor Babak Amir Parviz and his team want to embed hundreds of semitransparent LEDs onto a thin lens, letting wearers experience augmented reality right through their eyes.

The computers will allow a range of applications, from health monitoring to bionic sight that the Six Million Dollar Man would be proud of.

The lens uses sensors and wireless technology, and Parviz says the lenses don’t need to be very complex to be useful. Even a lens with a single pixel could aid people with impaired hearing or serve as an indicator in computer games.

With more colours and resolution, the repertoire could be expanded to include displaying text, translating speech into captions in real time, or offering visual cues from a navigation system. With basic image processing and Internet access, a contact-lens display could unlock whole new worlds of visual information, unfettered by the constraints of a physical display.

Soon it will be possible to connect your eyes to a navigation system so that you don’t get lost. This will be stage one in a technology which will ultimately mean that what we see is not reality. Mind you, the Buddhists have been saying that for centuries.

Computer hook-up makes brains stronger

Brains can adapt quickly to computers after being hooked up, and even become stronger for it, according to US boffins.

University of Washington researchers studied signals on the surface of the brains of epilepsy patients as they used ‘imagined’ movements to move a cursor, and found that watching the cursor respond to their thoughts made their brain signals stronger than usual.

It was claimed that this could in the future benefit stroke victims or others with brain damage, and that brains could learn very quickly how to control external devices such as computers or replacement limbs.
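A minimal sketch of such a feedback loop, assuming a single brain-signal feature (for example, band power over the motor cortex) is mapped to cursor velocity; the feature name, thresholds and gains here are illustrative, not the study’s actual decoder.

```python
def cursor_velocity(band_power, baseline, gain=0.1, deadband=0.05):
    """Toy brain-computer interface decoder: the cursor moves only when the
    signal feature exceeds its resting baseline by more than a small deadband,
    with speed proportional to the excess. Illustrative values only."""
    excess = band_power - baseline
    if abs(excess) < deadband:
        return 0.0  # resting-level activity: cursor stays still
    return gain * excess

# Feedback loop in miniature: a stronger imagined-movement signal moves the
# cursor faster, and seeing the cursor respond reinforces the signal.
print(cursor_velocity(1.0, 1.0))  # at baseline -> no movement
print(cursor_velocity(2.0, 1.0))  # elevated activity -> cursor moves
```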

“Body builders get muscles that are larger than normal by lifting weights,” said lead author Kai Miller. “We get brain activity that’s larger than normal by interacting with brain-computer interfaces. By using these interfaces, patients create super-active populations of brain cells.”

Co-author Rajesh Rao said that people had been looking at how imagined movements could control computers for some time, but that the study showed just how remarkable the brain’s learning ability is.

Research on the ability to control devices with brain signals has been done for a while, but this is an important example of how it might be able to train the brain up. Who knows – in the future we could see it being used as an educational tool.

We’ve already seen games like Brain Training for the Nintendo DS and the success of movement-based games on the Wii. Could we see games complete with electrodes connected up to your head in the future?

Pong’s already been done…

Suspended animation is not far away

Boffins are not far away from sticking people in suspended animation.

Apparently a gas used in weapons of mass destruction might be a good candidate for shoving people into life-saving suspended animation.

Hydrogen sulphide is toxic in large doses, but small amounts of the gas can make animals appear dead for a while then allow them to wake up unharmed.

Biochemist Mark Roth, from the Fred Hutchinson Cancer Research Center in Seattle, said that in the future an emergency medical technician might give hydrogen sulphide to someone suffering serious injuries, making them a little more immortal and buying them time to get the care they need.

Suspended animation occurs in the natural world: bears hibernate through winters, while plant seeds and bacterial spores can biologically snooze for millions of years.

Roth found that hydrogen sulphide binds at spots in the body that would usually be occupied by oxygen.

He has tried it on a mouse: all he had to do was return it to room temperature, and it woke up none the worse for wear. He has not yet tried it on a human.