Tag: computing

Thin client shipments get slimmer

IDC reported that the thin and terminal client market fell in the third quarter of 2015 by 6.7 percent.

Budget constraints and a move to use repurposed PCs and Chromebooks affected sales in the third quarter.

IDC said its overall forecast for the year will fall by over six percent compared to 2014.

But the market research company believes that shipments will grow between next year and 2019.

HP kept its lead in the market with a 26.9 percent share, but hot on its heels was Dell with 26 percent. NComputing, Centerm, Igel and others made up the remainder.

HP’s share fell by 8.2 percent in the quarter, while Dell fell by 10.6 percent.

Amdahl's law needs to be taken seriously

A computer researcher is close to rewriting Amdahl’s law, the theory that describes how the split between parallel and serial computation limits performance.

In a presentation entitled “Breaking the Law” at the International Supercomputing Conference this week in Leipzig, Germany, Thomas Lippert will reveal how he thinks the law can be adapted.

For those who came in late, Amdahl’s law was worked out in 1967 by noted Big Blue boffin Gene Amdahl who just happened to have the same name as the law. It is supposed to explain the limitations of parallel computing based on certain models.

The theory states that however much hardware you chuck at the parallel parts of a problem, some of the work still has to run in serial, and that serial portion is the limiting factor in speeding up tasks.

The maths put a ceiling on parallel speed-up, assuming some things are held constant, such as the problem size and the nature of the processors doing the computation.
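
In the standard textbook formulation (our summary, not spelled out in the article), if a fraction $p$ of the work can be parallelised across $N$ processors, the speed-up is

$$S(N) = \frac{1}{(1-p) + p/N} \;\xrightarrow{\;N \to \infty\;}\; \frac{1}{1-p},$$

so even a code that is 95 percent parallel can never run more than 20 times faster, however many processors you throw at it.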

Amdahl’s law was re-evaluated by John Gustafson, who came up with Gustafson’s law: if the problem size is not held constant but grows with the machine, parallel computer speed can scale up accordingly.
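
Again in textbook terms (our gloss, not from the article): Gustafson assumes the parallel workload grows with the machine, so if a fraction $p$ of the scaled runtime is spent in parallel code on $N$ processors, the scaled speed-up is

$$S(N) = (1-p) + pN,$$

which keeps growing as $N$ grows rather than saturating at $1/(1-p)$.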

Lippert was looking at experiments done as part of the DEEP Project, which investigates highly parallel computing models that help speed up supercomputers.

That project involves building high-performance systems called JUROPA (Jülich Research on Petaflop Architectures) and was set up by Bull, Partec and Intel.

“This machine is ideal for highly complex problems that exhibit a lower concurrency, in general. Most codes live somewhere in between. I want to find out, if we can bring the concepts together. The different architectures can assign the … different code parts according to the concurrency”, Lippert said.

Writing in his blog, he said that performance in supercomputers has scaled thanks to new programming models and hardware such as accelerators and graphics cards.

“Code needs to be structured according to concurrency levels, such as in programming languages like the one provided by Barcelona Supercomputing Center’s OmpSs,” Lippert said.

Despite the title of the presentation, the aim is not to challenge Amdahl’s law, Lippert said.

“On the contrary, I think, we are not taking Amdahl’s law seriously enough. It is simply obvious that we should adapt the right piece of hardware to the corresponding concurrency,” Lippert said.

“Only this approach has the potential to be most energy efficient and performance oriented at the same time.” 

Exascale computing ready in a decade

Intel fellow Shekhar Borkar has told the Semicon West fab tool vendor tradeshow that exascale computing will become a reality before the decade is out.

But he said that while the technology will be possible, thanks to parallelism and technology scaling, it will not meet its full potential.

According to EETimes, Borkar said that the problem is that exascale computing needs to overcome power consumption barriers.

He said that by about 2018, engineers are expected to create an exascale supercomputer, capable of a 1,000-fold performance improvement compared with today’s petaflop systems.

But an exascale computer will consume vast amounts of power, according to Borkar, and the real challenge will be to build one which consumes only 20 megawatts (MW) of power.

If they manage this then giga-scale systems consuming only 20 milliwatts of power can be used in small toys and mega-scale systems that consume only 20 microwatts could be used in heart monitors. 
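
Those figures all line up because they imply the same energy budget per operation; a quick consistency check (our arithmetic, not Borkar’s slides):

$$\frac{20\,\text{MW}}{10^{18}\,\text{ops/s}} = 20\,\text{pJ/op}, \qquad 10^{9}\,\text{ops/s} \times 20\,\text{pJ} = 20\,\text{mW}, \qquad 10^{6}\,\text{ops/s} \times 20\,\text{pJ} = 20\,\mu\text{W}.$$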

Raspberry Pi sells out in minutes

The entire stock of the Raspberry Pi sold out within a few hours of going on the market.

The device, touted as a new way to train the next generation of computer programmers, is about the size of a credit card and costs £22.

The first batch of 10,000 units has all gone, and the sites trying to sell them crashed under the demand. Distributor RS Components said that tens of thousands of people tried to order one, the greatest level of demand the company had ever received for a product at one time.

The company gets its first batch of devices from China next week and the Raspberry Pi Foundation, a charity, limited sales to one per customer to avoid scalpers and to “get to as many people as possible”, it said on Twitter.

The aim of the gear is to put the Raspberry Pi in the hands of every kid in the United Kingdom. Its only rivals are the Beagleboard and Pandaboard, which cost $150 and $180 respectively. Raspberry Pi’s costs have been kept low because of goodwill from suppliers and because its organisers aren’t paid by the charity.

NHS IT debacle debated in parliament

MPs have waded into a debate about the state of the NHS computer systems.

And although there have been no changes suggested at the moment, MPs have brought up some interesting points surrounding failures and escalating costs.

The first speech came from Richard Bacon, a Conservative MP for South Norfolk, who began the debate by talking about the national programme for IT in the health service. He said it was the “largest civilian computer project in the world” and was “spawned in late 2001 and early 2002” by the then Prime Minister, Tony Blair, who had a little tête-à-tête with Bill Gates and was “bowled over” by a vision of what IT could do to transform the economy and health service.

The idea was for information to be captured once and used many times, transforming working processes and speeding up communications.

Under the magic system hospital admissions and appointments would be booked online—the choose and book system; pharmacists would no longer struggle with the indecipherable handwriting of GPs; and drug prescriptions would be handled electronically. There was to be a new broadband network for the NHS, a new e-mail system, better IT support for GPs and digital X-rays.

However, Mr Bacon pointed out that most important of all, medical records would be computerised, thus transforming the speed and accuracy of patient treatment through what became known as the NHS care records service.

And the idea seemed so good to Mr Blair that he apparently rushed ahead in 2002 with promises that it would all be implemented by 2005. 

However it didn’t run smoothly. The project’s estimated whole-life costs were £5 billion, and its risk assessment gave a total score of 53 out of a maximum of 72; in other words, the project was very high risk. The core IT programme itself, meanwhile, was estimated to cost £2.3 billion.

However, to ensure it could claw back money, Blair’s boys decided that contractors would not get paid until they delivered, and that those not up to the mark would be replaced.

With this in mind you’d think no one would want to play ball, but in May 2003 potential bidders were given a 500-page document called a draft output-based specification and told to respond within five weeks. There were again difficulties over what exactly the government wanted to achieve, meaning that contracts were signed before the Government had understood what it wanted to buy and before the suppliers had understood what they were expected to supply.

The lucky four bidders (and we say that loosely) were Accenture; Computer Sciences Corporation, or CSC; Fujitsu; and BT. They were known as local service providers, or LSPs. BT and Fujitsu picked a US software firm, IDX, to work with, while Accenture and CSC both picked a British software company called iSoft, which offered a software system called Lorenzo, a program that had “achieved significant acclaim from healthcare providers”.

However, the program was not finished.

That caused a big headache for Accenture, the biggest LSP, with two contracts worth around £1 billion each. It was in partnership with iSoft and was trying to implement software that was basically not implementable. CSC faced a similar problem in the north-west and of course under the terms no one was supposed to get paid until something was delivered. As iSoft had not produced a working version of Lorenzo, the reality was that neither Accenture nor CSC had any software to deploy.

Accenture and CSC struggled on with the unusable Lorenzo but in 2006 they admitted the software had “no mapping of features to release, nor detailed plans.” In other words it had failed.

In March 2006, Accenture announced to its shareholders that it would set aside $450 million to cover expected losses on the programme. It repeatedly offered to meet its contractual obligations by using other software, although that might have bankrupted iSoft. The government wasn’t having any of it, however, and responded with a threat when Accenture talked about walking away.

And it was also bad news for CSC, whose own £1 billion contract for the north-west and west midlands regions left it in no better a position than Accenture to implement the unfinished Lorenzo software. It was also struggling to mop up after having caused the largest computer crash in NHS history.

Finally, iSoft was forced to throw in the towel and declare a loss of £344 million, which wiped out all the company’s past profits.

CSC had to continue on its own, while the other two providers, BT and Fujitsu, were having their own problems. They were trying to implement American software, which is not such an easy thing to do in a British hospital: American hospitals rely on billing for each and every activity and, conversely, do not expect to have to handle waiting lists.

The final report was delayed again and again, finally appearing in June 2006 and leading critics to claim that the NHS would most likely have been better off without the National Programme, in terms of what was likely to be delivered and when.

John Pugh, a Liberal Democrat MP for Southport, backed up these claims, telling MPs that “the project would not have done well in front of Alan Sugar on ‘The Apprentice’, let alone the Public Accounts Committee.”

He added that the government had to find £20 billion within the health service and that costs, particularly those of the patient administration systems, are still being picked up by hospitals.

Jackie Doyle-Price, Conservative MP for Thurrock, however pointed out that the project had always been “over-ambitious”.

She added that “we would all agree that it has been poorly led and ineffectively delivered” and the cost had escalated considerably.

Sir Maurice Wilkes dies

Sir Maurice Wilkes has died at the ripe old age of 97.

Known as the “father” of British computing, Sir Maurice was best known as the designer and creator of the Electronic Delay Storage Automatic Calculator (Edsac), a computer that ran its first program in May 1949.

This was instrumental in the British computer industry, as it was the first widely usable stored-program machine. It set standards for how computers should be used in academia and business that have lasted to the present day.

Speaking about this creation, Sir Maurice said in an interview last year: “We had vision. We saw computers as becoming important in the world, not just for mechanical calculations, but for business. But all we had was vacuum tubes.

“We couldn’t possibly have had any premonition of transistors and integrated circuits, and that’s what’s made the difference. Integrated circuits have given us speed and low cost and so on, but the central thing is reliability. Even if you don’t use them very often, they still work.”

However, it wasn’t his only innovation, with Sir Maurice continuing to pioneer computing. In 1951 he set to work on developing the concept of microprogramming, derived from the realisation that the central processing unit of a computer could be controlled by a miniature, highly specialised computer program held in high-speed ROM. The result of his work was that CPU development was greatly simplified.
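
To illustrate the idea (a toy sketch in Python, not Wilkes’s actual design): each machine instruction is executed by stepping through a short sequence of micro-instructions held in a fast control store, each of which asserts simple control signals inside the CPU. The instruction names and micro-operations below are hypothetical.

    # Toy microprogrammed control unit (illustrative only).
    # Hypothetical control store: instruction name -> micro-operation steps.
    MICROCODE_ROM = {
        "LOAD":  ["compute_address", "read_memory", "write_register"],
        "ADD":   ["read_registers", "alu_add", "write_register"],
        "STORE": ["compute_address", "read_register", "write_memory"],
    }

    def execute(instruction: str) -> None:
        """Run one machine instruction by replaying its microprogram."""
        for micro_op in MICROCODE_ROM[instruction]:
            print(f"{instruction}: assert control signals for {micro_op}")

    # Changing the instruction set now means editing the ROM's contents,
    # not redesigning hard-wired control logic -- Wilkes's simplification.
    for instr in ("LOAD", "ADD", "STORE"):
        execute(instr)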

The next computer for his laboratory was the Titan, a joint venture with Ferranti, which eventually supported the UK’s first time-sharing system and provided wider access to computing resources in the university, including time-shared graphics systems for mechanical CAD. One main feature of this computer was that it could provide controlled access based on the identity of the program, as well as, or instead of, the identity of the user.

It introduced the password encryption system used later by Unix. Its programming system also had an early version control system.
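
The principle, in a minimal modern sketch (illustrative Python, not the Titan or early Unix algorithm): store only a one-way transformation of each password, then re-apply it at login and compare.

    # Minimal sketch of one-way password storage (the hash and salting
    # shown here are modern choices, not Titan's or Unix's actual scheme).
    import hashlib
    import os

    def enroll(password: str) -> tuple[bytes, bytes]:
        """Return (salt, digest) to store in place of the password itself."""
        salt = os.urandom(16)
        digest = hashlib.sha256(salt + password.encode()).digest()
        return salt, digest

    def check(password: str, salt: bytes, digest: bytes) -> bool:
        """Re-apply the one-way function and compare with the stored digest."""
        return hashlib.sha256(salt + password.encode()).digest() == digest

    salt, stored = enroll("correct horse")
    assert check("correct horse", salt, stored)
    assert not check("wrong guess", salt, stored)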

Fittingly, his work was recognised throughout the industry. He was awarded the Turing Award in 1967, the Faraday Medal from the Institution of Electrical Engineers in London in 1981, and the Kyoto Prize for Advanced Technology in 1992. He was also knighted in 2000.

The "ten commandments" of computer ethics get Taiwanese treatment

When Ramon Barquin presented a list of “ten commandments” for computer ethics in an ethics conference paper in Washington in 1991, he hoped that the text would become an effective code of ethics for the proper use of information technology.

Over the next 15 years, the text was translated into a dozen languages, including simplified Chinese for readers in communist China, but it was never translated into complex Chinese characters for readers in Taiwan. Until now.

You see, China and Taiwan are separated by more than the choppy waters of the Taiwan Strait. Mao Zedong decreed that China use a simplified writing system for its vast population, and the system in place is called Simplified Chinese Characters. Taiwan, being a different country with a different history, still uses the traditional characters of the ancient Chinese writing system, and the characters used here are called, yes, Complex Chinese Characters.

Enter Jason Chang, a graduate student at Chung Cheng University in southern Taiwan, who volunteered to turn the ten commandments into “real” Chinese for the U.S. website.

A second-year student in the master’s programme in the university’s department of medical information data management, Chang told TechEye that he had heard about the U.S. website that hosted various translations of the original text and volunteered to send in a version in complex Chinese characters, which had been overlooked for the previous 15 years, representing Taiwan.

The “ten commandments of computer ethics” were first presented in a paper written by Barquin, president of the Computer Ethics Institute in Washington and a former IBM executive who runs his own consulting firm.

Barquin, like most Westerners, was unaware that there was a difference between simplified Chinese and complex Chinese characters as used separately in Beijing and Taipei. But when he learned of the differences, Barquin said he would be happy to have someone in Taiwan add a separate translation for readers here.

Among the “commandments” listed by Barquin are: “Thou Shalt Not Use A Computer To Harm Other People”; “Thou Shalt Not Interfere With Other People’s Computer Work”; and “Thou Shalt Not Snoop Around In Other People’s Computer Files.”

Want more?

  • Thou Shalt Not Use A Computer To Steal
  • Thou Shalt Not Use A Computer To Bear False Witness
  • Thou Shalt Not Copy Or Use Proprietary Software For Which You Have Not Paid
  • Thou Shalt Not Use Other People’s Computer Resources Without Authorisation Or Compensation
  • Thou Shalt Not Appropriate Other People’s Intellectual Output
  • Thou Shalt Think About The Social Consequences Of The Program You Are Writing Or The System You Are Designing
  • Thou Shalt Always Use A Computer In Ways That Ensure Consideration And Respect For Your Fellow Humans

It’s not Moses. And we’re not atop Mt. Sinai. But Barquin’s words have travelled far and wide in a Babel of languages, and now they’ve come to Taiwan, too. Food for thought.

Is this ethics code for the IT era a good idea? “Thou Shalt Not Copy Or Use Proprietary Software For Which You Have Not Paid” is probably going to be around for a good long while, no?

Mobile computing market continues to grow

Despite turbulent economic conditions, mobile computing continues to see surging demand, according to a new report by In-Stat.

The company has said this has resulted from sleeker designs, new form factors, and pent-up business demand. According to In-Stat, mobile computing devices, including tablets, netbooks, smartbooks and laptops, will grow at 19.1 percent through 2014 and account for over 400 million units.

Jim McGregor, chief technology strategist, said: “While there will be a battle for the lower-end internet-centric devices like tablets and netbooks, notebooks will continue to be the overall demand driver as consumers focus on lighter and lower-cost PCs and businesses continue to transition to mainstream and high-performance mobile platforms.”

“In addition, demand for mobile computing is coming from both developing and industrialised regions.”

Tablets will record the highest growth, at 123.6 percent through 2014. In-Stat said laptop shipments will reach 291 million units in 2014 and account for 52 percent of the computing market, while Asia Pacific will lead all regions in growth, surpassing 36 percent of the total market in 2014.

UK's IT sector may suffer as students ditch computer courses

The IT sector in the UK may fall behind as more and more students ignore inadequate computer courses, according to the Royal Society, drawing on figures from the Joint Council for Qualifications.

It found that there was a 33 percent drop in students sitting ICT GCSEs between 2006 and 2009, with a similar drop for ICT A Levels between 2003 and 2009. It even found a whopping 57 percent drop in Computing A Levels between 2001 and 2009, suggesting that more and more people are ignoring what was once a popular industry to break into.

The Royal Society believes the reasons for the decline are the poor design and delivery of the curriculum, which affects how students perceive these courses, how much they get out of them, and how much fun they have while on them. In effect, they are simply not engaging enough. TechEye’s younger staffers remember ICT courses being a bit of a doss.

Professor Steve Furber, Fellow of the Royal Society, who led the research, praised the UK’s history as a leader in IT, but warned that it is falling behind as a result of poor courses.

“If we cannot address the problem of how to educate our young people in inspirational and appropriate ways, we risk a future workforce that is totally unskilled and unsuited to tomorrow’s job market,” he said.

A number of big IT companies, such as Google and Microsoft, are participating in the research, keen to encourage uptake among computer courses in order to ensure a skilled workforce for the future.

“At a time when computers are playing an ever more important role in our work and everyday lives, we should be able to encourage more, not fewer, students to learn how to create with technology. It’s a sad loss that we’re missing this opportunity,” said Dr. David J. Harper, head of university relations at Google EMEA.

Students in the UK may be flaunting their iPhones and iPods, but that won’t help them when they enter the workplace with little to no qualifications.

Semiconductor revenue to reach $344 billion by 2014

Global semiconductor revenue is expected to reach $344 billion (£226 billion) by 2014, according to new figures released today by International Data Corporation (IDC).

IDC forecasts that semiconductor revenue will reach $274 billion (£180 billion) this year and $295 billion (£194 billion) in 2011, and will jump to $344 billion (£226 billion) by 2014, a compound annual growth rate of 8.8 percent from 2009 to 2014. This is very good news for the industry, which saw a woeful 9 percent drop in revenue last year as the recession ate into most sectors’ profits.
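
As a sanity check on those figures (our arithmetic, assuming the 8.8 percent rate spans the five years from 2009 to 2014), the forecast implies a 2009 base of roughly $226 billion, consistent with the drop IDC describes:

$$V_{2009} = \frac{V_{2014}}{(1 + \text{CAGR})^{5}} = \frac{\$344\,\text{B}}{1.088^{5}} \approx \$226\,\text{B}.$$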

The outlook is even better for the computing industry, with an expected compound annual growth rate of 12.2 percent over the same five year period. The reasons for such strong growth include high demand for PC semiconductors, which have seen growth of over 35 percent year on year. Mobile PC applications and enterprise upgrades over the next few years are also expected to aid in pushing growth within this sector.

The success of smartphones has resulted in a record revenue of $59.3 billion (£39 billion) for the wireless industry sector of the semiconductor business. IDC forecasts that the current double-digit growth in this area will fall to single-digit figures in 2011 and 2012 as the market stabilises and competition drives prices down.

Semiconductor revenues for memory, including DRAM and Flash, will rise to $66.7 billion (£44 billion) this year, a growth of over 52 percent compared to last year. This increase is primarily fuelled by high memory demand for the netbook, smartphone, and tablet markets, but IDC expects a potential decrease in memory sales over the next few years as supply outgrows demand.

The industrial, military, aero, and automotive industries also had a huge craving for semiconductors, with revenue growth up 20 percent this year compared to last. Energy-efficient lighting and the increased usage of semiconductors in cars are expected to drive a compound annual growth rate of 13.2 percent between 2009 and 2014.

The consumer industry is less fortunate, with a modest 5.8 percent growth this year, which IDC expects will fall sharply over the next few years.

“Overall, we believe that the semiconductor market recovery seen this year is similar to the one in 2004. The 2010 growth rate based on the bottoms-up model used in the Semiconductor Application Forecaster is consistent with our top-down linear-regression model that factors in seasonality in semiconductor orders and with our scenario analysis model,” said Mali Venkatesan, research manager for semiconductors at IDC.

Venkatesan qualified this growth expectation with the possibility of a slowdown in global economic recovery due to the Euro crisis, US unemployment, and an asset bubble in the BRIC countries. He said that this may push growth from the second half of this year into 2011, but said that, despite this, smartphones, mobile PCs, tablets, and cars will still see strong growth in 2010 and 2011.