Tag: stanford

Scientists promise to speed up chips

Scientists at the Stanford School of Engineering believe they have devised a method that could eventually speed up data transport in semiconductors by as much as 30 percent.

While a typical semiconductor has millions of transistors connected by tiny copper wires, it is the sheath around those wires that has given the Stanford scientists their inspiration.

They said that tantalum nitride has been used to sheath the copper wires within chips.

But experiments show that if graphene is used as a sheath, electrons can fly through the copper wires faster.

The sheath over the tiny copper wires prevents them from interacting with the silicon and also conducts electricity.

The Stanford team showed that a graphene layer would be eight times thinner than one made from tantalum nitride. A graphene sheath can also act as an auxiliary conductor of electrons, as well as isolating the copper from the silicon.

While the method holds promise, there are still hurdles to adopting graphene for this purpose, including finding ways to grow graphene directly onto the wires during mass production.

Stanford bioengineers develop superfast energy efficient chips

Stanford bioengineers have developed a circuit board that simulates brain activity 9,000 times faster than a typical PC while using significantly less power.

The Neurogrid circuit board can simulate more neurons and synapses than other brain mimics on the power it takes to run a tablet.

Kwabena Boahen, associate professor of bioengineering at Stanford, in an article for the Proceedings of the IEEE, said that the brain was a better model for computing.

Boahen and his team have developed Neurogrid, a circuit board consisting of 16 custom-designed “Neurocore” chips. Together these 16 chips can simulate one million neurons and billions of synaptic connections. The team designed these chips with power efficiency in mind. Their strategy was to enable certain synapses to share hardware circuits. The Neurogrid is the size of a tablet and will probably end up controlling a humanoid robot.

The downside is that you have to know how the brain works to program a Neurocore. The next stage is to create a neurocompiler, so that you would not need to know anything about synapses and neurons to be able to use one of these.

Million-neuron Neurogrid circuit boards cost about $40,000. Boahen believes dramatic cost reductions are possible. Neurogrid is based on 16 Neurocores, each of which supports 65,536 neurons. Those chips were made using 15-year-old fabrication technologies.

By switching to modern manufacturing processes and fabricating the chips in large volumes, he could cut a Neurocore’s cost 100-fold – which means you could have a million-neuron board for $400 a throw. 
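
The arithmetic behind those figures is simple enough to check. Here is a quick sketch in Python, assuming the board price scales roughly with the cost of its Neurocore chips:

```python
# Back-of-the-envelope check of the Neurogrid figures quoted above.
# Assumption: the board price scales roughly with the cost of its Neurocore chips.

NEUROCORES_PER_BOARD = 16
NEURONS_PER_NEUROCORE = 65_536
BOARD_COST_TODAY = 40_000      # US dollars, as quoted
COST_REDUCTION_FACTOR = 100    # Boahen's projected saving from modern fabs

total_neurons = NEUROCORES_PER_BOARD * NEURONS_PER_NEUROCORE
projected_cost = BOARD_COST_TODAY / COST_REDUCTION_FACTOR

print(f"Neurons per board: {total_neurons:,}")          # 1,048,576 -> the "million neurons"
print(f"Projected board cost: ${projected_cost:,.0f}")  # $400
```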

Researchers improve wi-fi

Researchers at the swanky US university of Stanford have worked out a way of improving wi-fi reception.

One of the problems of wi-fi is that it can be buggered up when you have many people packed into a flat or office building.

However, researchers at Stanford claim to have found a way to turn crowding into an advantage.

Using a dorm on the Stanford campus, they built a single, dense wi-fi infrastructure that each resident can use and manage like their own private network.

Dubbed BeHop, the system can be centrally managed for maximum performance and efficiency while users still assign their own SSIDs (service set identifiers), passwords and other settings.

Yiannis Yiakoumis, a Stanford doctoral student who presented a paper at the Open Networking Summit this week, said that the whole thing can be managed with cheap consumer-grade access points and software-defined networking.

Each household installs its own wi-fi network with a wired broadband link out to the Internet. Each of those networks may be powerful enough to give good performance under optimal circumstances within the owner’s unit, but it may suffer from interference with all the other privately run networks next door.

Yiakoumis and his mates built a shared network of access points using home units provided by NetGear. They modified the firmware of those APs and, using SDN, virtualised the private aspects of the network.

Residents named and secured their own virtual networks as if they had bought and plugged in a router in their own rooms. 
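
To make the architecture concrete, here is a minimal sketch in Python, with entirely hypothetical names (BeHop's actual controller code and AP firmware are not reproduced here), of how a central SDN controller might map per-resident virtual networks onto a shared pool of physical access points:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the BeHop idea: many residents share a dense pool of
# physical access points, but each keeps a private virtual network (own SSID,
# own password), and the central controller decides which AP serves each one.

@dataclass
class AccessPoint:
    name: str
    channel: int
    hosted_ssids: list = field(default_factory=list)

@dataclass
class VirtualNetwork:
    ssid: str          # chosen by the resident
    passphrase: str    # chosen by the resident
    resident: str

class BehopController:
    """Toy central controller: places each virtual SSID on the least-loaded AP."""

    def __init__(self, access_points):
        self.access_points = access_points

    def attach(self, vnet: VirtualNetwork) -> AccessPoint:
        ap = min(self.access_points, key=lambda a: len(a.hosted_ssids))
        ap.hosted_ssids.append(vnet.ssid)
        return ap

aps = [AccessPoint(f"ap-{i}", channel=1 + 5 * (i % 3)) for i in range(4)]
controller = BehopController(aps)
home = controller.attach(VirtualNetwork("dorm-312", "hunter2", "resident-312"))
print(f"dorm-312 is served by {home.name} on channel {home.channel}")
```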

Scientists recreate life with first organism computer model

Scientists have recreated life with the world’s first computer model of a complete living organism.

Mycoplasma genitalium might be the world’s smallest free-living bacterium, but modelling every single molecular interaction has been a massive task.

Researchers playing God at Stanford University used data from over 900 scientific papers to create the computer model of the parasitic bacterium, opening the door for computer aided design in bioengineering and medicine.  

The final virtual cell model made use of more than 1,900 experimentally determined parameters, with computational models making sense of “enormous” amounts of data, according to the scientists.
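
As a rough illustration of what that means in practice, the sketch below shows the general shape of such a whole-cell model: independent sub-process modules, each driven by experimentally determined parameters, advancing a shared cell state in small time steps. The module names and numbers are purely illustrative, not the Stanford group's actual code.

```python
# Hypothetical sketch of a whole-cell simulation loop: each sub-process module
# reads the shared cell state and its own measured parameters, then advances
# the state by one small time step. Names and values are illustrative only.

def transcription(state, params, dt):
    state["mrna"] += params["transcription_rate"] * dt

def translation(state, params, dt):
    state["protein"] += params["translation_rate"] * state["mrna"] * dt

def metabolism(state, params, dt):
    state["atp"] += (params["atp_production"] - params["atp_consumption"]) * dt

MODULES = [transcription, translation, metabolism]

# Stand-ins for the >1,900 experimentally determined parameters mentioned above.
PARAMS = {
    "transcription_rate": 0.5,
    "translation_rate": 0.1,
    "atp_production": 2.0,
    "atp_consumption": 1.5,
}

state = {"mrna": 0.0, "protein": 0.0, "atp": 10.0}
dt = 1.0                         # seconds per step
for _ in range(600):             # simulate ten minutes of cell time
    for module in MODULES:
        module(state, PARAMS, dt)

print(state)
```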

Computer-aided design using the newly developed model can now begin, and could even mean the “wholesale creation of new microorganisms”.

Bacteria or yeast could be used to mass produce pharmaceuticals, for example, or personalised medicine.  

The modelled bacterium has only 525 genes, compared to E. coli, which is a tad more complicated at 4,288, but it seems the researchers will look to model larger organisms in the future.

That is not to say that we are in the realms of 80s sci-fi flick Weird Science just yet. The scientists say that even medicinal applications are a long way off, and it is going to take an effort on the level of the Human Genome Project to get close to a human model.

Intel throws $100 million at US universities

Intel has announced that it is to invest $100 million in university research in the US over the next five years, marking the latest in a series of global investments from the chip firm.

As part of the investment the company will establish Intel Science and Technology Centres in a number of universities in 2011. These centres will aid Intel’s research and development in key areas such as visual computing, mobility, security and embedded technology.

The investment means that Intel will be forking over up to five times more funding to universities than previously, which will help cover operating, maintenance and staff costs.

Stanford University will house the first Intel Science and Technology Centre, with a focus on visual computing. The Sandy Bridge platform will be a major element of research here, particularly in terms of combined visual and 3D graphics.

Intel said that its decision to establish these research centres is part of a move to “a new model of collaboration”. They will be jointly led by university researchers and Intel staff, and Intel is promising “maximum flexibility” in how they will operate, though it may fine-tune their focus as its research aims change.

2011 looks to be a big year for Intel investment, with previous announcements this month of a $500 million investment in the Leixlip plant in Ireland and a $2.7 billion investment in the Kiryat Gat plant in Israel. With January not even over yet, Intel could have plenty more investment announcements up its sleeves.

Google trials 1Gbps fibre broadband service in US

Google has today announced that it is launching its own fibre-optic broadband service in the US called Google Fiber, starting with a trial run for residents of Stanford, California.

Google signed an agreement with Stanford University to build a super-speed fibre-optic broadband network for the Residential Subdivision of the university which amounts to some 850 homes belonging to faculty and staff members.

Residents of the area will be able to get internet speeds of up to 1Gb per second, over 100 times the speed most people are used to. To put that into perspective the UK government is hoping to get 2Mbps speeds for everyone in the country. 
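
For a rough sense of scale (our own back-of-the-envelope calculation, not a quoted figure), here is what those speeds mean for downloading a 4GB file:

```python
# Rough comparison of download times at the speeds mentioned above:
# 1 Gbps (Google's trial) versus 2 Mbps (the UK government's universal target).

FILE_SIZE_GB = 4                       # illustrative file, e.g. an HD film
file_size_bits = FILE_SIZE_GB * 8e9    # gigabytes -> bits (decimal GB)

for label, bits_per_second in [("1 Gbps", 1e9), ("2 Mbps", 2e6)]:
    seconds = file_size_bits / bits_per_second
    print(f"{label}: {seconds:,.0f} seconds (~{seconds / 60:.1f} minutes)")
```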

Google is selecting new communities to roll out the service to, inviting residents anywhere to call on the company to pick their area. Google is hoping to provide its service for between 50,000 and half a million people, but if it really takes off, as it could, Google might decide to take on the big telcos and offer broadband to everyone.

The prospect of Google providing our internet service as well as being one of the dominant players on the internet itself raises a number of serious questions. With Google’s tarnished reputation from privacy abuses like the Street View fiasco, can it really be trusted to be an ISP? 

The deal Google made with Verizon a few months ago over net neutrality caused staunch criticism from every angle, including former allies in the net neutrality debate. Google will not have to worry about net neutrality, of course, if it’s the one calling the shots on services.

While it is unlikely that Google would risk the backlash that would inevitably arise from overtly abusing such a position, for example by blocking rival company websites on its network, Americans will need to decide if the superior speeds are worth handing Google that much more power. Call us paranoid. We know you are already.

The service should be available in early 2011.