Tag: 3D

Nvidia pushed out of the integrated graphics market

Sandy Bridge and Llano platforms from Intel and AMD will force Nvidia to exit its integrated GPU business by the end of the year.

Beancounters at Forbes estimate that revenues from Nvidia’s integrated graphics fell below $400 million for 2011, and 2012 could see a complete exit.

At the moment, Sandy Bridge and Llano are accelerated processor units incapable of giving high-end cards a run for their money, but Nvidia has been making a bob or two from selling GPUs for the notebook and desktop segments.

Forbes has been looking at some figures and has come to the conclusion that integrated GPUs really are causing the death of discrete cards.

In desktops, the number of discrete desktop GPUs sold as a proportion of desktop shipments has fallen from an estimated 62 percent in 2007 to about 47 percent in 2011.

Normally Forbes would predict stable levels in the future, but if APUs threaten this business the figure could continue to fall.

In the worst case, only 35 percent of desktop users will be using discrete GPUs in 2018, while notebooks will stay flat at 33 percent and not grow.

Forbes warns that if that is the case, Nvidia will have to pack its bags and leave the discrete market completely.
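For the curious, here is a back-of-the-envelope sketch (ours, not Forbes') of the attach-rate arithmetic behind those figures: the average yearly decline from 2007 to 2011, and what the worst-case 2018 number would imply per year.

```python
# Back-of-the-envelope sketch (ours, not Forbes') of the discrete GPU
# attach-rate figures quoted above.

def annual_decline(start_share, end_share, years):
    """Average percentage-point decline per year between two data points."""
    return (start_share - end_share) / years

# Historical: 62 percent of desktops in 2007 down to 47 percent in 2011.
historical = annual_decline(62, 47, 2011 - 2007)

# Forbes' worst case: 47 percent in 2011 down to 35 percent in 2018.
worst_case = annual_decline(47, 35, 2018 - 2011)

print(f"Historical decline: {historical:.2f} points per year")   # ~3.75
print(f"Worst-case decline: {worst_case:.2f} points per year")   # ~1.71
```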

Flexible fibres to create bendable 3D displays

Tiny fibres which project varying amounts of light could be used to create flexible 3D displays, according to researchers at MIT.

3D TV technology might be generating a collective shrug among the public, despite continuing to take more of the LCD market, but flexible technology offers an innovative way to view 3D images.

That is because a team at MIT has created an ingenious 400 nanometre fibre that can send different light information to a viewer’s eye, in a similar manner to the varying images transmitted as part of a stereoscopic 3D image.

A hollow fibre is built up with alternating layers of materials with different optical properties, creating a mirror surrounding a drop of fluid in the centre. When the fluid is excited with energy, light bounces around the mirrored interior of the fibre and radiates from the core as a 360 degree laser beam.

The core is surrounded by four channels filled with liquid crystals which can change the brightness of the light emitted from the centre. These crystals can be activated separately to give control over what is emitted and in which direction. Adding more liquid crystal channels scales easily, so information can be sent in a variety of ways. It is also possible to produce the fibres in long lengths, up to kilometres, so we hope this means they could be woven into 3D jumpers at some point in the future.

With only one pixel being transmitted from each fibre, there are questions about how viable a working display would be. The team reckons that by getting the fluid droplet at the centre to move fast enough, it could fool a viewer's eye into believing it is more than just one coloured point.

Even if this doesn’t work, there are possibilities to use the technology in medical applications, such as irradiating diseased tissue, as it could be threaded into narrow openings.

IBM's 3D qubit device brings quantum computing even closer

IBM says it has made a significant advance towards creating mind-bendingly powerful quantum computers with its superconducting 3D qubit device.

With experts hoping to construct a fully working quantum computer in the next decade or so, the ability of IBM boffins to reduce errors in quantum processing and hold on to quantum information for longer has made this even more likely.

Qubits, or quantum bits, are the basic units used to process information in a quantum computer. Like the ‘bits’ used in regular computers, quantum bits can be either 0 or 1. Rather more perplexingly, the weird world of quantum mechanics means that they can also be both at the same time.

It is this ability to exist in multiple states that opens up potential for massively increased computing power.

To use one particularly brain-wrenching example of that power: most fast modern computers will allow a user to work on a small number of computations simultaneously, but a single 250-qubit state could simultaneously contain more bits of information than there are atoms in the universe.
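A minimal sketch of why the numbers get so silly: describing an n-qubit state classically takes 2^n complex amplitudes, so the count explodes long before n gets large. The loop below simply evaluates that formula for a few qubit counts.

```python
# Minimal sketch: an n-qubit quantum state needs 2**n complex amplitudes
# to describe classically, which is why qubit counts matter so much.

for n in (10, 50, 250):
    amplitudes = 2 ** n
    print(f"{n:>3} qubits -> {amplitudes:.3e} amplitudes")

# 250 qubits already needs roughly 1.8e75 amplitudes, the sort of
# astronomical figure behind the atoms-in-the-universe comparison.
```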

Such processing power is almost unbelievably faster than what is possible today, but with advances being made by IBM, the reality is almost within reach.

The number of qubits that can be made to function together is a lot smaller at the moment, just two or three. But the prospect of putting these into a working computer has received a boost from reducing the number of calculation errors seen in previous attempts at quantum computers, and from lengthening the time for which the qubits retain their quantum properties.

IBM scientists have now been able to achieve this with 3D superconducting qubits, which the firm believes are likely to work well in the transition to scaling up and manufacturing.

With this it was possible to get qubits to retain their quantum states for up to 100 microseconds, up to four times longer than was previously possible. Crucially, this passes the threshold needed to allow for error correction in the qubits.

According to IBM, this leaves scientists with almost the minimum requirements for a full scale quantum computing system.

Intriguingly, the scientists reckon that we are now beginning to get out of the drawing board phase of development, with questions needing answers about processing demands for “error correction, I/O issues, feasibility, and costs with scaling”.

IBM says a practical quantum computing system is likely to be run on a “classical system” which can be connected to quantum computing hardware.

Mark Ketchen, manager of the Physics of Information group at IBM, told TechEye that there is still a good way to go. “Our best guess is 20 years or more for commercially viable systems,” Ketchen said.

Ketchen believes that there are still significant hurdles to be jumped before we start thinking about full systems.

“On the quantum side, qubit metrics must still be significantly improved so that error correction can be implemented with practical overhead,” he said. “We are just now crossing the threshold where error correction can work. The overhead will be reduced significantly as metrics further improve. We have to perfect the technology to very rapidly read out the states of many of the qubits in parallel, also with a very low error rate and limited quantum noise. And we must figure out how to scale up.

“We are still at the level of a few qubits. How do we integrate many together? Hundreds, then thousands, perhaps a few million eventually, but billions are not needed such as the number of transistors in classical computing systems.”

Aside from the functionality of quantum processing, there are also challenges in creating a working computer that can handle the almost incomprehensible masses of data.

“On the classical side there are many difficult practical problems,” Ketchen said. “How do we package such a system at 15 millikelvin (mK) and handle the huge number of inputs and outputs to and from the quantum system? How do we handle the thermal engineering for such a large system running at 15 mK?

“A very special conventional computer will be required to operate the much more powerful quantum computer.” 

TSMC claims its 28nm process is better than Intel's 22nm

TSMC has been telling the world+dog that ARM chips made on TSMC’s 28nm process are shedloads better than an Atom made on Intel’s 22nm FinFET process.

TSMC’s President for Europe, Maria Marced, told Electronics Weekly that ARM on TSMC’s 28nm gives better performance and power than Atom on Intel’s 22nm process. Chipzilla apparently missed the fact that it’s not just technology, it’s also the architecture.

She said that the term ’28nm’ or ’20nm’ does not refer to a gate length; it is merely a descriptor of a node. The quality of a node is no longer the number attached to it, but the characteristics of the devices made on it, she said.

While some punters were having yield problems with 28nm, Marced said that the defect density on 28nm is better than on 40nm at the same stage of product life.

She said that TSMC had 36 products on 28nm and, compared with 40nm at the same stage, had three times the number of tape-outs and three times the speed of deployment.

Marced expects TSMC’s 28nm production to account for 10 percent of company revenues by the end of the year, and the foundry will be in risk production on 20nm soon too.

Its 20nm process will deliver a 1.9x density improvement over 28nm, with 25 percent less power consumption and 15-20 percent more performance.
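As a rough sketch of what those factors mean for a single design, the snippet below applies them to a hypothetical 28nm chip. The baseline figures are made up purely for illustration; only the scaling factors come from TSMC's claims above.

```python
# Rough sketch of TSMC's quoted 20nm-versus-28nm scaling factors applied
# to a hypothetical 28nm design. Baseline numbers are assumptions for
# illustration; only the 1.9x / -25% / +15-20% factors come from the article.

baseline_28nm = {
    "density_Mtr_per_mm2": 10.0,   # hypothetical transistor density
    "power_W": 2.0,                # hypothetical power draw
    "relative_performance": 1.0,   # normalised baseline
}

scaled_20nm = {
    "density_Mtr_per_mm2": baseline_28nm["density_Mtr_per_mm2"] * 1.9,
    "power_W": baseline_28nm["power_W"] * (1 - 0.25),                  # 25 percent less power
    "relative_performance": baseline_28nm["relative_performance"] * 1.175,  # midpoint of 15-20 percent
}

for key in baseline_28nm:
    print(f"{key}: {baseline_28nm[key]:.3g} -> {scaled_20nm[key]:.3g}")
```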

Intel is betting on FinFETs, nonplanar transistors in which the gate wraps around a raised fin of silicon, and it is rushing to use them in its 22nm designs.

However, TSMC said that it will wait until 14nm before bothering with FinFETs. Marced said that the ecosystem was not ready for FinFETs at 20nm, and TSMC did not want to delay 20nm.

While the rest of the world is worried about what is happening in Europe, TSMC is expecting strong growth from Europe this year. Europe delivers about nine percent of TSMC’s cash and it expects this to increase this year. 

From NFC to artificial brains: Future Horizons' future of chips

As well as looking at the state of the chip industry, IFS2012 saw Future Horizons give some predictions into the application of semiconductors over the coming years.

Starting with what the coming year is likely to have in store, CTO Mike Bryant gave his predictions stretching out into the almost unknown, twenty-odd years hence.

It is expected that Apple will finally get to stick some NFC chips into the iPhone. Samsung is also expected to push the technology this year, according to some IFS attendees.

As expected, Ultrabooks will move into mainstream consciousness, though, as Bryant points out, most buyers will wait until a generation running on Windows 8 appears later in the year before splashing out. Windows on ARM is not expected to have much effect, though we imagine it will grab headlines when released.

The TV market will carry on pushing 3D and, following their CES show-stealing, smart TVs will see a boost. Whether manufacturers will be able to win the struggle for profits this year is another question, though. Apple TV rumours were also fuelled, adding to expectations that even larger needless shiny rectangles will make it into our lives by the end of the year.

Just as TechEye has been saying, semiconductors will also make an assault on classrooms, with government backing products such as the Raspberry Pi to promote computing and programming for a new generation.

Longer term, Bryant gave his predictions on the further evolution of the semi industry, with Intel leading the way on 15nm Trigate development and larger-scale production possibly beginning the year after. 450mm wafers will begin testing at its Albany site, with low-scale production the following year.

Meanwhile, TSMC and GloFo will fire up 20nm planar chip development, though whether difficulties will appear at the more advanced processes is unknown.

ARM’s global takeover, according to its bosses at least, will also begin with the big.LITTLE concept appearing  in its A15/A7 chip combination.

4G should finally arrive in the airwaves of Blighty in 2013 too, some years after others got their hands on it. Though it could be a while still before many people actually get to use it.

2014 should see some exciting developments with production of memristor technology, while 2015 could bring about Intel fiddling around with 11nm process Trigate chips.

By 2016 the LED lighting market should finally move into people’s lives, overcoming current cost issues as larger production volumes cut prices, leading to a $30 billion industry by the end of the decade. Work on self-powering devices should become mainstream that year too, beginning to open up the almost frightening possibilities of the Internet of Things.

Jumping to 2018, Intel will be producing 11nm chips at large scale as Moore’s Law begins to slow to a three-year cycle. Large scale 15nm production on 450mm wafers should see massive numbers of chips churned out, adding to the ubiquity of semiconductors in our lives.
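As a minimal sketch of what that cadence change means, the snippet below compares transistor counts under a two-year and a three-year doubling cycle. The 2012 starting count is an arbitrary assumption; only the doubling cadences relate to the prediction above.

```python
# Minimal sketch of Moore's Law slowing from a two-year to a three-year
# doubling cycle. The 2012 starting transistor count is an arbitrary
# assumption; only the cadences matter for the comparison.

def transistors(start_count, start_year, year, cycle_years):
    """Transistor count assuming a doubling every `cycle_years` years."""
    return start_count * 2 ** ((year - start_year) / cycle_years)

start = 1.4e9  # arbitrary 2012 baseline

for year in (2018, 2024):
    two_year = transistors(start, 2012, year, 2)
    three_year = transistors(start, 2012, year, 3)
    print(f"{year}: {two_year:.2e} on a 2-year cycle vs {three_year:.2e} on a 3-year cycle")
```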

By this point it is entirely possible, Bryant says, that graphene circuits could be rivalling silicon with large scale production of chips based on the material.

Towards the end of the decade 5nm process devices could be demonstrated, with a convergence of memory and logic technologies allowing for the development of artificial brains.

From here on out the roadmap blurs into science fiction, but in the next ten to fifteen years work into 3nm processes should push Moore’s Law to its absolute limits, while we could finally be buying examples of the next step: quantum computing.

Implantable mobile phones will mean that it is truly impossible to be uncontactable, 24/7 telehealth monitoring could allow people to put their life in the hands of NHS IT staff, and quasi-intelligent robots will FINALLY begin to become available.

As for hoverboards though, it appears we will have to wait for IFS2013 at least before we find out.

Smart TV hype suggests Android home invasion

With this year’s CES open for business, it seems that one of the headline products is web-connected smart TVs.

Last year may have seen 3D sets begin to make their mark, but after notably failing to make the desired impact, attention appears to be turning towards smart TVs.

We have already seen LG talk up its Google TV 2.0 set, while Lenovo has announced its own Android-based set too, and it’s expected that Samsung will enter the fray. Apple has also been murmuring about its own web-connected TV recently, where it hopes it will win on content.

A report from DisplaySearch also suggests that this could be a sign of things to come. In many countries, set replacement is being pushed largely by consumer interest in web-connected TVs rather than 3D. In North America, that means the million smart TVs sold in 2011 will increase to 24.7 million by 2014.

Just as 3D has struggled to become ubiquitous in the home, it seems that smart TVs will also face some challenges.

While the launch of a wide range of sets should see a large increase in numbers shipped over the coming years, the impact is not expected to be huge. As smart TVs become more sophisticated, they take on more functionality, like apps and full browsers. Indeed, the Lenovo K91 almost blurs the line between TV and all-in-one PC.

In the early days, such features will put device makers in direct competition with themselves, as they sell more accessible products like tablets. It’s on mobile devices that apps and web surfing are more at home, rather than on a screen across the room.

According to DisplaySearch, by 2015 TVs with unlimited web browsing capability will make up under 10 percent of those shipped in Western Europe.

But as smart devices like tablets and smartphones become even more commonplace in homes, there’s likely to be more scope for them to work together. Watching TV is better suited to larger sets, and it’s the ability to wirelessly connect with tablets and smartphones that will make smart TVs more useful.

It could be argued that smart TVs are a sign we’re on the way to convergence in the home, which raises the question: which operating system is going to win out in the end? It’s hard to see Apple’s locked-in approach winning hearts and minds forever. The nature of Android on a range of devices offers consumers the ability to pick and choose while staying on the same OS, rather than Cupertino’s like-it-or-lump-it approach. That was always Samsung’s idea when it originally pushed Bada.

Companies like Qualcomm are ploughing heaps of cash into R&D, convinced that there will be convergence – eventually tying in with the Internet of Things – where your smartphone will act as the central, personal device for everything else around you.

With some high-profile smart TV announcements made and more surely yet to come, we’re certain the battle for controlling all the screens in and out of your home is going to escalate. No wonder the patent cases are flying.

LG claims 3D TVs will have WiDi

Intel’s Wireless Display connection will be appearing under the bonnet of LG 3D TVs next year.

Apparently LG has signed a deal with Intel to ensure WiDi technology features in its 3D TVs next year. The connection enables content stored on laptops, smartphones and tablets to be transmitted and displayed on smart TVs without the need for either a Wi-Fi connection or external adaptors.

It creates a direct connection between chips in the source device and the display. That means a notebook or other mobile gadget can connect directly to an LG Cinema 3D Smart TV, projector or monitor while staying away from the internet connection itself. It is aimed at streaming material stored on the hard disk.

Good3DTV quoted the senior vice president of LG’s TV business unit, Seog-ho Ro, as saying that the deal with Chipzilla would help owners to enjoy content in a more comfortable and stylish way.

3D television is still struggling thanks to the recession and a lack of decent content. TV makers had been hoping that pushing Internet connected TVs which did much more would cause the market to move a little. 2012 will see a range of new gadgets hammered onto 3D tellies in the hope of getting more punters interested.

Sony sued for not allowing customers to sue

Sony is being sued for forbidding PS3 customers from suing it.

After all the mess with the PlayStation, Sony got so miffed with being named and shamed in class action lawsuits that it inserted a clause into its PlayStation Network’s End User Agreement stating that its users cannot sue.

Anyone who was annoyed about Sony’s insecure network or faulty products would have to go to arbitration which effectively would kill off any expensive class actions.

As ironic as finding a black fly in your Chardonnay, or getting a death row pardon several minutes too late, Sony has now been sued in a class action.

The clause does not apply to those who signed up for a PS3 or PSN before the September update to the EULA. The lawsuit claims that Sony should not be able to force users to forgo their rights in order to use the device they purchased. In other words, they might not have bought the PS3 or signed up to PSN if they had known that they would not be allowed to take part in any class actions.

According to the Examiner, the lawsuit also claims that Sony buried the provisions, attempting to hide them from customers.

The PS3 EULA runs to 21 pages, is not available online, and can only be read on the device itself. The ‘no suing’ provision is placed toward the end of the document, where users are unlikely to see it, catching the end user in a trap.

If Sony loses then similar actions could be taken against Microsoft and Electronic Arts who saw what Sony did and said “I’ll be ‘aving some of that.”

LG rushes to become early WiDi adopter

LG and Intel have inked a deal to use Chipzilla’s WiDi technology in LG’s top of the range tellies, but it could struggle to see widespread adoption despite the benefits for ultrabooks.

The pair recently announced a “strategic alliance” to promote the use of Intel’s Wireless Display (WiDi) technology, which provides wireless connectivity for HD content stored in notebooks and other devices.  A recent statement announced that LG’s Cinema 3D Smart TVs would be the first to feature Intel’s wireless, clutter-reducing gear.

The internal WiDi system will mean content can be streamed directly to a TV set, or potentially to a projector or monitor, without needing miles of HDMI or VGA cables. WiDi doesn’t require any internet connection, just a WiDi enabled laptop and screen to stream content.  

On the face of it the system certainly sounds useful, with consumers increasingly accessing content on the big screen from a laptop. Despite Intel and LG’s gusto, there is not quite as much enthusiasm from other corners.

Paul Gray at DisplaySearch believes that, while the joint venture shows “some possibilities”, there are many problems in the way of widespread adoption. This is at least partly due to LG’s history of quickly picking up and dropping projects.

“With other manufacturers you might take it more seriously,” he told TechEye, “but if you look at LG it often jumps on new technology like this, and just because they have made an announcement doesn’t mean that they will push it.”

Gray believes that there are many problems with LG and Intel’s WiDi, which has been attempted in various guises by other firms, too: “I have had reports of terrible latency problems, with a video delay of a couple of seconds, which means that it would be pretty bad for gaming,” Gray said.

Sony has also attempted similar technology and, as Sony tends to do, developed it very well. However, it also cost a lot, and the public is not interested in spending so much money on getting rid of a few wires. Basically, it is incredibly expensive to do properly.

“In this sense WiDi is not a game changer, though it will be interesting to see who manages to lead on the technology in future. At the moment, though, it has limited applications other than in business use.”

One area which Gray flags as useful is doing away with bulky and expensive socket components in laptops.

This fits in with Intel’s Great Light Hope: the Ultrabook. Intel has already been decking out its Ultrabooks with WiDi, and it certainly makes sense given the two things Intel is so desperate to reduce: size and, more importantly, cost.

“Intel can get rid of the outputs on its devices, and that means not having to put in expensive socket components that take up a lot of space,” Gray says.  “This could be very good for Intel.”

Dropping unnecessary baggage makes sense as Intel is betting the farm on affordable Ultrabooks. Whether LG is the right partner to bring WiDi into living rooms across the world is another question.

The WiDi-enabled LG set will be on view at CES in Lost Wages next month. 

Semi industry worried about ever-smaller processes

With moves to ever-smaller semiconductor manufacturing processes, problems arise as engineers attempt to cram even more transistors onto chips.

According to DigiTimes, industry players are already concerned about the progression to smaller processes. Intel and TSMC, for example, are expected to begin producing chips below the 20 nanometre barrier in the next couple of years, and it will not be long before there are production challenges.

Eastern sources believe that in the next few years equipment costs could cause headaches for cash-strapped firms, as next-generation production methods using extreme ultraviolet lithography tools cost around $100 million a pop.

The technological bottlenecks could be the most significant problem, with TSMC already working on 14nm chips and risk production due in 2014.

Luckily many labs are looking at developing alternative technologies which could eventually supplant silicon as the ubiquitous semiconductor material.

IBM recently highlighted some of its weird and wonderful attempts to solve the problem creating even smaller chips, and development with super material graphene continues at a mesmerising pace.

But graphene is not the only material which could power a new breed of computer. Researchers at Harvard and Purdue universities in the US have continued work into 3D chip structures which are set to become common in chips next year.

Using microscopic nanowires made from indium-gallium-arsenide, which has a faster electron flow than silicon, the researchers are looking to create an alternative for future 3D gate designs.

According to the team, it is difficult to use silicon for smaller process 3D chip designs, and it believes that indium-gallium-arsenide could do the trick. The researchers reckon that a 3D device built with indium-gallium-arsenide could be up to five times faster at conducting electrons than silicon, and with a dielectric coating of aluminium oxide power consumption could also be significantly lower.

Because the ‘top-down’ production method is already common in the semi manufacturing industry, the researchers think there should not be too much of a problem producing the new breed of chips either.