Author: Matthew Finnegan

Scientists uncover gene responsible for beer foam

Scientists are on the way to creating the perfect head of foam on a beer after uncovering the gene responsible for a frothing pint.

It is the proteins in the wheat and yeast used in beer production that determine the quality of the foam, together with the carbon dioxide gas produced in fermentation. According to the beer-sipping scientists, proteins gather around the gas produced by the yeast, stabilising the foam and stopping it from disappearing, as a pint of cider generally tends to.

However, no one actually knew which yeast gene was responsible for creating the foam-stabilising protein – until now, at least.

Scientists have recently published a paper in the ACS’ Journal of Agricultural and Food Chemistry claiming to have revealed the identity of the magical foam-giving gene, CFG1.

CFG1 stands for ‘carlsbergensis foaming gene’, carlsbergensis being the yeast originally discovered by an employee of the Danish brewery Carlsberg.

According to the report, the discovery “opens the door to new possibilities for improving the frothy ‘head’”.

The advancement could have differing impacts depending on where the technique is employed.   While most beer drinkers in Europe prefer a decent head on a beer, in the UK bar staff pouring a few extra millimetres of foam are likely to be greeted with dirty looks and an angry demand for a top-up. 

Nevertheless, it should see barflies across the world raising a glass to the wonders of genetic science if the research succeeds in creating a better beer.

Better power amplifiers could double mobile battery life

Power drain on future mobile devices could be reduced by half using a more efficient power amplifier which is being developed by an MIT spin-off company, Eta Devices.

At the moment, mobile phone chips that turn electricity into radio signals are rather wasteful, providing only around 35 percent efficiency. This means that the majority of energy used to send information is lost to heat, draining battery at a swift pace.

According to MIT Technology Review, the researchers at Eta Devices say they have solved the problem by more intelligently determining when extra power is needed to send a signal.

At the moment, power amplifiers use transistors that operate at two levels: standby, and an output mode for sending a signal. However, the standby voltage is generally kept high, as big jumps from low to high power can distort the radio signal – and this creates large demands on power, depleting battery life.

The advance is essentially a “blazingly fast gearbox”, the researchers say, and is able to choose among different voltages that can be sent across a transistor used in the chip. This is done up to 20 million times a second, selecting the voltage that minimises power drain.
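The ‘gearbox’ idea can be sketched in miniature: at each instant, pick the smallest available supply voltage that still covers the signal’s required amplitude, rather than holding one fixed high voltage. This is a toy illustration only, not Eta Devices’ actual design – the voltage levels and the simple waste model are made up for the example.

```python
# Hypothetical discrete supply voltages the 'gearbox' can switch between.
SUPPLY_LEVELS = [0.5, 1.0, 2.0, 4.0]

def select_supply(required_amplitude):
    """Return the lowest supply level that still covers the required amplitude."""
    for level in SUPPLY_LEVELS:
        if level >= required_amplitude:
            return level
    return SUPPLY_LEVELS[-1]  # clip at the maximum level

def wasted_power(samples, supply_fn):
    # Supply headroom not delivered to the signal is dissipated as heat;
    # model the waste as (supply - amplitude) summed over all samples.
    return sum(supply_fn(a) - a for a in samples)

samples = [0.3, 0.9, 1.7, 3.2, 0.4]           # toy signal envelope
fixed = wasted_power(samples, lambda a: SUPPLY_LEVELS[-1])  # always 4.0 V
adaptive = wasted_power(samples, select_supply)             # switch per sample
print(adaptive < fixed)  # True – the adaptive supply wastes less power
```

In the real chip this selection happens up to 20 million times a second; the sketch just shows why tracking the signal with discrete levels beats a fixed high standby voltage.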

This method, called asymmetric multilevel outphasing, will help even when receiving a call or downloading a video on a smartphone, for example, as the amplifier is busy even then, sending out receipts for data packets.

Overall this could help double the efficiency of the power amplifier from the 35 percent achieved in most handsets.
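The arithmetic behind that claim is straightforward: for a fixed radio output, the power drawn from the battery is the output divided by the amplifier’s efficiency, so doubling the efficiency halves the drain. The 1 W output figure below is a hypothetical round number for illustration.

```python
# Amplifier input power for a fixed radio output: P_in = P_out / efficiency.
radio_output_w = 1.0                 # hypothetical 1 W of transmitted signal
p_in_at_35 = radio_output_w / 0.35   # ~2.86 W drawn at today's 35% efficiency
p_in_at_70 = radio_output_w / 0.70   # ~1.43 W drawn at doubled efficiency
print(p_in_at_70 / p_in_at_35)       # 0.5 – half the battery drain
```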

The technology is still at the lab stage, but commercialisation, starting with LTE base stations, is expected to begin in 2013.

Scientists create first all-carbon solar cell

Researchers at Stanford University have developed a solar cell made entirely of carbon, offering a cheaper alternative to the current standard.

Solar cells typically use more expensive materials such as indium tin oxide (ITO) – also found in smartphones, LCD screens and many other applications – which rely on scarce elements.

Now, the Stanford team has come up with an alternative method of production by creating the first solar cell made entirely of the abundant element carbon.

The cell consists of a photovoltaic layer sandwiched between two electrodes. Typically these electrodes would be made of conductive materials like ITO, but the researchers used graphene, an atom-thick carbon-based material, and carbon nanotubes – essentially rolled nano-scale sheets of carbon.

According to the researchers, carbon nanotubes have significantly better electrical conductivity and light absorption properties and would allow for easier production than conventional cells.

However, the efficiency falls well short of conventional cells, with the prototype currently managing a conversion rate of just one percent.

The team is confident that this can be improved, and is looking at ways to increase the efficiency by, for example, creating smoother layers of material to make it easier to collect current.  

One of the problems is that the prototype device mainly absorbs light in the near-infrared spectrum. But if the researchers can find ways of making the carbon nanomaterials absorb a wider range of wavelengths, including the visible spectrum, this could improve efficiency.

Even if it is not possible to massively increase the efficiency of cells there should be some uses. Carbon, which forms super-strong diamonds, is a resilient material and can remain stable at high temperatures where other cells would stop working.

“We believe that all-carbon solar cells could be used in extreme environments, such as at high temperatures or at high physical stress,” Michael Vosgueritchian, one of the researchers, said. “But obviously we want the highest efficiency possible and are working on ways to improve our device”.

Metamaterial tech threatens the end for power cables

Oxford University research into metamaterials could help the public do away with the masses of wires that connect computing devices.

Isis Innovation, the research commercialisation arm of the university, has come up with new technology that allows devices to both charge up batteries and transmit data without the need for any cables.

It is already possible to charge devices using inductive charging, as seen in devices such as electric toothbrushes, but the team has struck on a new way to deliver both power and data.

Using engineered metamaterials, the team says it is possible to connect devices with a patterned conductive layer that can be added to pretty much any surface. This means it is possible to use a carpet to provide power to a lamp, for example, and the same should apply to all manner of devices.  

Dr Chris Stevens, one of the researchers, said: “You could have a truly active, cable-free, batteryless desktop that can power and link your laptop or PC, monitor, keyboard, mouse, phone and camera.” 

This could mean putting the technology behind a computer monitor’s screen to transfer digital files to and from a USB stick “simply by tapping the flash drive against an on-screen icon”, with touted transfer speeds of 3.5 gigabits per second.

In the future it could be possible to have your stereo, TV, DVD and satellite box all powered through the carpet and wallpaper. An electric car could also be charged via a mat, for instance.

By doing away with power cables, components could also be easier to recycle. As it stands, devices are soldered or wired together and so are difficult to recycle. Stevens claimed that by getting rid of wires and connecting components on a sealed circuit board, it would be a lot easier to take them apart without desoldering or heat treatments, which can potentially damage components.

“High spec computers can be sent back to the manufacturer when the next model comes out and the processors can be reused for lower spec home computers,” he said. “Eventually those same processors can end up in TVs and washing machines – dramatically increasing the lifecycle of electronics.”

4G to give mobile operators complexity headache

Network complexity will cause mobile operators problems next year as the number of technologies used to meet heavy data demand increases.

According to Actix, which provides companies with mobile network analytics, operators are likely to have difficulties in integrating the deployments of 4G and small cells, used for 3G data offloading, alongside existing 2G and 3G systems.   

This means that costs could rise and customer experience could suffer as operators expand services to deal with growing demands on networks, with customers losing calls and operators subsequently losing customers. Switching between 2G and 3G technologies can potentially cause connection problems if networks are not managed effectively.

A total of 150 mobile operators are expected to roll out LTE offerings next year using the networks set up by the big players. Meanwhile, the number of small cells used by operators is expected to rise as they replace macro cells.

This means that mobile operators are likely to be using at least two vendors to access the four technologies, as well as a range of cell sizes, such as macro, pico, metro and femto, in order to boost connectivity.

Subsequently, resources could be stretched thin, putting extra demand on tools, processes and staff.  According to Actix, which spoke to 400 of its mobile operator clients, there is concern that manually combining networks will no longer be viable, and there is a move towards a heterogeneous network to manage systems.  

Bill McHale, CEO at Actix, said in a statement that operators will have to scale out their activities by making use of customer insight and network analytics, as well as multi-technology optimisation.

He added that with 4G on the way, and more and more tablets hitting the market, operators need to “get this right, or risk losing subscribers”. 

Mobile internet access to exceed PC by 2015

The number of people accessing the internet through mobile devices will exceed PC users by 2015 in the US, with western Europe not far behind.

According to analysts at IDC the number of people using the internet via a PC in the US will shrink from 240 million in 2012 to 225 million in 2016.  Meanwhile mobile users will increase from 174 million to 265 million over the same period.

This shift is to be followed in other parts of the world, with western Europe following suit a couple of years later.

As smartphones become increasingly commonplace, the situation is likely to be the same, if not more pronounced, in emerging markets as the transition is made from feature phones, while PC sales stay lower.

IDC says that this shift will coincide with mobile advertising across the world rising from $6 billion in 2011 to $28 billion in 2016. 

However, this shift is likely to have an impact on overall advertising revenues. Many companies have struggled to generate cash from mobile advertising despite earning large revenues from PC-based use.

Facebook is one company that has struggled to effectively monetise mobile advertising, as reflected in its falling share price.

It is expected that the 66 percent of Facebook users visiting the site through a PC in 2012 will drop to 52 percent in 2016.

AMD and ARM team up for server chips

AMD has branched out from the x86 platform, announcing that it will begin production of server processors using ARM designs in 2014.

AMD will produce 64-bit multicore SoCs as new approaches are explored to deal with growing cloud and data centre demands, offering a lower power alternative to the x86 architecture.

The first server processors will feature SeaMicro Freedom supercompute fabric, following AMD’s acquisition of SeaMicro in March, and will allow thousands of processor clusters to be linked together.

ARM CEO Warren East said the combination of the two companies’ technology could help transform the market.

“The industry needs to continuously innovate across markets to meet customers’ ever-increasing demands, and ARM and our partners are enabling increasingly energy-efficient computing solutions to address these needs,” East said in a statement. 

Lower power ARM-based servers are increasingly enjoying popularity, and major vendors are keen to be involved in the development of more energy efficient technology with better performance per watt.

As part of the announcement, OEM partners showed that they were committed to developing in the data centre, with HP, Dell and Red Hat heaping on the praise for more flexible approaches to server hardware. 

The announcement by Intel’s two main rivals is a sign of underlying changes in the chips industry. While x86 has traditionally been seen as the mainstay of PC and server computing, the lines are beginning to blur, with Intel attempting to get its chips into a growing number of tablets and smartphones, and ARM moving into server chip designs.

Data centres are demanding lower power chips – so AMD’s partnership with ARM is a real sign of change, and could well threaten Intel, the pair’s main rival.

Amazon slams iPad Mini in public ad

As competition between tablet makers hots up, Amazon has thumbed its nose at the iPad Mini in the latest advert for its own 7 inch tablet, picking apart claims made by Apple.

In a new advert for the Kindle Fire HD, Amazon puts its own tablet up against Apple’s newly launched device in full-on sparring mode, under the banner “much more for much less”.

Amazon didn’t miss its chance to highlight one of the main differences between the two devices, namely the price tag, proudly displaying $199 in large type for the Kindle Fire HD, compared to $329 for the iPad Mini.  

As well as claiming that the Kindle Fire HD has 30 percent more pixels than the slightly larger iPad Mini, Amazon took a swipe at its rival by posting a comment from gadget news website Gizmodo: “Your [Apple’s] 7.9 inch tablet has far fewer pixels than the competing 7 inch tablets! You’re cramming a worse screen in there, charging more, and accusing others of compromise? Ballsy.”
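The “30 percent more pixels” figure can be checked from the two panels’ published resolutions – 1280×800 for the 7 inch Kindle Fire HD and 1024×768 for the 7.9 inch iPad Mini:

```python
# Pixel counts from the two panels' published resolutions.
kindle_fire_hd = 1280 * 800   # 1,024,000 pixels on the 7 inch panel
ipad_mini = 1024 * 768        # 786,432 pixels on the 7.9 inch panel

extra = kindle_fire_hd / ipad_mini - 1
print(f"{extra:.0%}")  # 30% – matching Amazon's claim
```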

The advert claims that the Apple device cannot handle HD pictures as its own can, and also makes some noise about superior wi-fi connectivity and improved speakers.

Of course, Apple is guilty of its own name-calling as the fight for share of the smaller tablet market gets increasingly catty, proving that it is more than up for a bit of sniping at its competitors.

At the launch of the iPad Mini, the Cupertino company addressed the challenge it faced from competitors such as Google and Amazon, which have already released 7 inch tablets. Marketing boss Phil Schiller contemptuously described its rivals’ efforts as having “failed miserably” in producing a tablet equal to those produced by Apple.

Schiller levelled criticisms at its competition, throwing in barbs about screen size and bezels, lamenting the woeful attempts to chip away at the iPad’s dominance.   

However, we imagine Amazon bosses were rubbing their hands when they subsequently announced that – on the day after Apple’s release – the Kindle Fire HD achieved record sales, according to AllThingsD.

World's smallest 4k screen unveiled

While Sony and LG have recently announced gigantic 84 inch 4k TVs, one manufacturer has revealed its plans for releasing a 9.6 inch 4k screen.

4k, also beginning to be known as UltraHD, is so named after its enormous 3840×2160 resolution. So far, manufacturers have veered strongly towards the larger end of the spectrum, and are charging equally large prices for the nascent technology.

Both Sony and LG Display have announced 84 inch TVs using the technology, and both come with price tags that break right through the $20,000 mark. Panel manufacturer AUO has also announced it will be bringing out 65 inch and 55 inch panels. 

But Japanese manufacturer Ortus Technology has taken a different approach, producing a 9.6 inch screen – though it will not be made for consumer electronics.

At 458 pixels per inch (ppi) it is way ahead of the likes of Apple’s Retina display, with the iPhone 5 reaching 326 ppi, though the manufacturer notes that such resolutions “exceed [the] discrimination limit of human eyes”. However, the viewing angle is narrower than an iPad’s, with the 4k device managing 160 degrees compared to Apple’s 178.
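The quoted densities follow directly from each screen’s resolution and diagonal size – the iPhone 5 figure below assumes its 1136×640 panel on a 4 inch diagonal:

```python
import math

# Pixel density: pixels along the diagonal divided by the diagonal in inches.
def ppi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(3840, 2160, 9.6)))  # 459 – the Ortus 4k panel (quoted as 458)
print(round(ppi(1136, 640, 4.0)))   # 326 – the iPhone 5's Retina display
```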

The screens, which are ready to ship in November, are not targeted at mainstream use.

According to the manufacturers, the screens are more likely to be used in applications such as professional video equipment or medical equipment, where such resolutions are either a strict requirement or highly beneficial. This could mean, for example, use as a 4k TV camera monitor screen.

US companies fail to disclose use of conflict minerals

US electronics companies are failing to disclose their usage of conflict minerals ahead of the introduction of a federal law – with 90 percent of companies yet to produce the necessary data.

Following a ruling by the Securities and Exchange Commission (SEC) in August, electronic component manufacturers have been told to be more transparent about the sourcing of the materials they use.

The ruling is part of attempts to clamp down on the use of minerals such as tin, tungsten, tantalum and gold, which are mined in regions where armed conflict and human rights abuses take place. One of the best known is the resource-rich Democratic Republic of Congo.

These minerals often end up in components used in all manner of consumer devices, from PCs to smartphones.  However, consumers are generally unaware that revenues eventually make their way up the supply chain into the hands of those controlling the mineral production, effectively supporting human rights atrocities.

As the public outcry against labour conditions in Foxconn factories, used to manufacture Apple products, has shown, there is consumer interest in ethical production.  But many companies have failed to provide clarity about the minerals they source for their components.

Nintendo, for example, was marked in a recent report as one of the worst for investigating its own supply chain.

Authorities in the US are now pushing publicly traded manufacturers to not only disclose the origins of minerals used in production, but to actively investigate their supply chain.

Under the SEC ruling, manufacturers now have 21 months to comply with the legislation. A survey from IHS iSuppli has shown that only a small proportion have made any headway.

Only 11.3 percent of electronics components manufacturers were found to have disclosed conflict mineral information. These companies accounted for just 17.1 percent of the market, meaning that the vast majority of data on the use of conflict minerals is unavailable.

Sasha Lezhnev of the Enough Project, a US group which conducts research in conflict areas such as the DR Congo, believes it is “disingenuous” to say that electronics companies are unprepared to take action against conflict minerals.

Lezhnev says that companies such as Intel and HP have known about the problems surrounding conflict minerals and have taken action to address supply concerns. Apple fully identified all of the smelters in its supply chain in six months, for example.

“It takes some effort, but frankly we’ve seen that it’s not that difficult for companies to do this work,” Lezhnev told TechEye. “So it is time for the laggard companies to stop making excuses and get on with these reforms”.