Tag: Cloud

Intel not going bust anytime soon

Chipmaking supremo Intel reported better than expected quarterly results after seeing growth in its data centre and internet of things businesses.

That apparently helped offset weak demand for personal computers that use the company’s chips.

Intel shares rose as much as 9.2 percent after market before paring some of their gains, probably as investors twigged that Intel had just cut its full-year capital expenditure forecast for the second time.

Intel said that it was expanding its line-up of higher-margin chips used in data centres to counter slowing demand from the PC industry, and in April agreed to buy Altera for $16.7 billion as part of its cunning plan.

Revenue from the data centres grew 9.7 percent to $3.85 billion in the second quarter from a year earlier, helped by continued adoption of cloud services and demand for data analytics.

Chief Financial Officer Stacy Smith predicted robust growth rates for the data centre, Internet of Things and NAND businesses.

Revenue from the PC business, which is still Intel’s largest, fell 13.5 percent to $7.54 billion in the quarter ended June 27.

“Our expectations are that the PC market is going to be weaker than previously expected,” Smith said.

Research firm Gartner predicted that PC shipments would fall 4.5 percent to 300 million units in 2015, with no respite until at least 2016.

Intel forecast current-quarter revenue of $14.3 billion, plus or minus $500 million, while the cocaine nose jobs of Wall Street were expecting revenue of $14.08 billion.

The company also cut its 2015 capex forecast to $7.7 billion, plus or minus $500 million. It had already cut the full-year forecast to $8.7 billion from $10 billion in April.

The company’s net profits fell to $2.71 billion from $2.80 billion a year earlier.

Net revenue fell 4.6 percent to $13.19 billion, but edged past the average analyst estimate of $13.04 billion. Intel’s stock has fallen about 18 percent this year.

Smart lighting will slash costs

Installing smart lighting that’s connected to the internet of things (IoT) means that municipal authorities and commercial enterprises could save as much as 90 percent on energy costs.

Gartner said smart lighting will grow from 46 million units shipping this year to as many as 2.54 billion units in 2020.

Dean Freeman, author of the Gartner report, said that cutting energy costs by 90 percent takes more than just installing LED (light emitting diode) units.

He said there are five elements to smart lighting: LEDs, sensors and controls, connectivity, analytics, and intelligence.

Prices of solid state lighting have now reached the level where their use is “compelling”, he said. With analytics software, building owners will be able to analyse lighting patterns and so improve the implementation. And if the dashboard is in the cloud, the lighting can be viewed and controlled from anywhere.

Both North America and Europe are beginning to install systems with remote management of both fixtures and bulbs, and networked systems are now a reality.

Submerge your supercomputer in liquid

A team of boffins has discovered that if you take your supercomputer and immerse it in tanks of liquid coolant you can make it super efficient.

The Vienna Science Cluster uses immersion cooling, which involves putting SuperMicro servers into a dielectric fluid similar to mineral oil.

The servers are slid vertically into slots in the tank, which is filled with 250 gallons of ElectroSafe fluid, which transfers heat almost as well as water but doesn’t conduct an electric charge.

The Vienna Science Cluster 3 system has a mechanical Power Usage Effectiveness rating of just 1.02, meaning the cooling system overhead is just 2 percent of the energy delivered to the system.

This means that 600 teraflops of computing power uses just 540 kilowatts of power and 1,000 square feet of space.
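As a back-of-the-envelope check on that PUE figure, the overhead works out as follows. This is only an illustration, and it assumes the article’s 540 kilowatts is the IT load rather than the total facility draw:

```python
def pue(total_facility_kw: float, it_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT power."""
    return total_facility_kw / it_kw

it_kw = 540.0             # the article's figure, assumed here to be the IT load
total_kw = it_kw * 1.02   # a mechanical PUE of 1.02 adds 2% for cooling

print(f"PUE: {pue(total_kw, it_kw):.2f}")
print(f"Cooling overhead: {total_kw - it_kw:.1f} kW")  # roughly 10.8 kW
```

In other words, at this scale the cooling system consumes only around ten kilowatts, a tiny fraction of what conventional air conditioning would draw.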

Christiaan Best, CEO and founder of Green Revolution Cooling, which designed the immersion cooling system, said: “It is particularly impressive given that it uses zero water. We believe this is a first in the industry.”

Most data centres cool IT equipment using air, while liquid cooling has been used primarily in high-performance computing (HPC). But cloud computing and “big data” could make liquid cooling relevant for a larger pool of data centre operators.

The Vienna design combines a water-less approach with immersion cooling, which has proven effective for cooling high-density server configurations, including high-performance computing clusters for academic computing, and seismic imaging for energy companies.

GlobalFoundries becomes an IBM supplier

As we reported yesterday, the US government cleared the acquisition of IBM’s semiconductor business by GlobalFoundries (GloFo).

And today IBM announced what that would mean for its future business.

GloFo will become IBM’s exclusive semiconductor supplier for the next 10 years – so it will make its POWER chips that go into Big Blue servers.

It’s not the end of the story for IBM semiconductor engineers, however. IBM Research will carry on doing semiconductor and material science and that will help the company to continue selling mainframes, storage and POWER systems to aid it in its push into cloud, big data and analytics systems.

IBM said that its semiconductor and material research has delivered important technology including copper chips, silicon germanium and quantum computing.

It’s good news for GloFo. The Abu Dhabi-based company, which acts as a foundry producing semiconductors for its customers, also gains access to leading IBM technology, including its state-of-the-art fabrication plant (fab) in New York State.

Samsung selects AMD for thin clients

Samsung has selected the AMD Embedded G-Series SoC (system on chip) for a new line of all-in-one cloud monitors featuring integrated thin client technology.

The Samsung 21.5-inch TC222W and 23.6-inch TC242W are powered by AMD Embedded G-Series SoCs that couple high-performance compute and graphics capability in a highly integrated, low power design.

The AMD SoC improves data transfer rates and saves space on the motherboard, which makes it a perfect fit for the compact form factors required by thin clients.

AMD thin client solutions depend on the move to cloud-based computing. Increasingly, thin clients are also being used for more complex and high performance tasks. This is especially true for devices incorporating x86 CPUs and sophisticated graphics. Although to be fair, the thin client market has not been doing so well lately, so AMD pinning its hopes on it is not necessarily a good thing.

AMD Embedded Solutions vice president and general manager Scott Aylor said that thin client was a key market for AMD Embedded Solutions and he was “thrilled that Samsung has chosen to partner with us for their newest line of products.”
“The collaboration with Samsung builds on the number one position AMD holds in a market that continues to grow, becoming more and more prevalent in commercial installations that serve a broad range of markets.”

With planned availability starting in Q3 2015, the Windows-supported Samsung cloud monitors will provide customers with expanded choice, capability and configuration flexibility.

Using Samsung’s professional-grade display panel, the cloud monitors should be super-souped up as an option for desktop virtualisation.

The Embedded AMD G-Series SoC combines graphics computing circuitry with x86-based central processing cores in the same accelerated processing unit (APU) design. They are built to handle a range of embedded applications and are designed to provide power efficient graphics.

Nvidia thinks it can make a billion from the cloud

Maker of chips that help you see things, Nvidia, expects its cloud computing revenue to hit $1 billion in the next two to three years.

It says that demand for big data analysis drives growth in graphics chips.

CEO Jen-Hsun Huang told reporters a day before the Taipei Computex show that cloud computing is the company’s fastest-growing segment, with revenue increasing at about 60-70 percent a year.

Cloud computing allows people to play graphics-heavy games over the Internet, Huang said. He also noted that the company’s GPUs can now be used for a wide variety of applications, such as voice commands like those used by Microsoft’s search engine.

However he warned that it is going to be a while before people can start playing streamed games at 4K resolution.

While it is possible to stream 4K movies from online services like Netflix to PCs, TVs and set-top boxes, streaming games from the cloud requires many infrastructure changes, Huang said.

Nvidia can currently stream 1080p games at 60 frames per second from its Grid online gaming service, but the technology needs to be developed for 4K streaming and a lot of fine-tuning is needed at the server level, Huang said.
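The jump Huang describes is easy to quantify: at the same frame rate, 4K pushes four times the raw pixels of 1080p. This is a rough illustration of why the server side needs reworking; encoded video bitrates in practice scale less than linearly with pixel count:

```python
# Raw pixel throughput, before any video compression is applied.
pixels_1080p = 1920 * 1080   # 2,073,600 pixels per frame
pixels_4k = 3840 * 2160      # 8,294,400 pixels per frame
fps = 60                     # Grid's current streaming frame rate

ratio = (pixels_4k * fps) / (pixels_1080p * fps)
print(ratio)  # 4.0 -- four times the pixel rate at the same frame rate
```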

“It’s going to be a while,” Huang said.

The cloud is not the only area Nvidia has been moving into. Lately it signed up for an automotive chip programme with automaker Tesla Motors.

Apple confirms it's bought German firm

An Apple representative has confirmed that the company has snapped up a German augmented reality company as part of its acquisition strategy.

Metaio’s speciality is technology that lets people wearing special glasses transform what they see into a kind of touch screen.

Its technology has both business and consumer applications.

A notice on its website – metaio.com – said its products and subscriptions aren’t available to buy any more, but it will continue its support until June 30th and will honour downloads of previous buys until December 15th this year.

The company offered a number of products and subscriptions including a software developer kit, an “authoring” tool, and a cloud environment.

Apple reputedly wants to use Metaio tech in its own devices. No financial details of the transaction are available.

SSD blokes eat their words

Your SSD will not lose data if you forget to turn on the juice, claims the guy who wrote a report saying that your SSD data will die if you forget to plug it in for three days.

In a conversation with PCWorld, Kent Smith of Seagate and Alvin Cox, the Seagate engineer who wrote the report, claimed we had read it wrong.

“People have misunderstood the data that they’re looking at,” Smith said.

Cox said he wouldn’t worry about losing data, and that the report pertains to end of life.

“As a consumer, an SSD product or even a flash product is never going to get to the point where it’s temperature-dependent on retaining the data.”

The original presentation dates back to when Cox was a chairman on a JEDEC committee. It was supposed to help data centre and enterprise customers understand what could happen to an SSD.

However it was only supposed to show what happened after a drive had reached the end of its useful life span and was then stored at abnormal temperatures. It’s not intended to be applied to an SSD in the prime of its life in either an enterprise or a consumer product.

However the five-year-old presentation resurfaced in a forensic computing blog as an explanation of why an SSD could start to lose data in a short amount of time at high temperatures.

The story was picked up by the International Business Times and got a little out of control, they said.

From there, the internet seemed to amplify as fact that an SSD left unplugged would lose data—all citing Cox’s JEDEC presentation.

But Cox and Smith said that’s rubbish, as an SSD that isn’t worn out rarely experiences data errors. Data centre use also subjects SSDs to far more “program/erase” cycles than a typical person would generate under any normal circumstances.

Personal drives such as the Corsair Neutron GTX have been pushed beyond 1.1 petabytes of writes before wearing out. That’s one of the criteria you’d need to meet to lose data.

Torture tests have run SSDs well beyond their rated life spans using 24/7 workloads and found that they survived.

Enterprise customers also are largely immune to heat-related dead drive issues. That’s because, again, it’s a scenario for only after the SSD has been worn out. And since enterprise customers would prefer tape or other cheaper methods to back up data over an SSD, it’s an unrealistic scenario where data loss would happen to enterprise customers, Smith said.


Software in India sees a surge

Enterprise software sales in India grew by eight percent last year.

And it seems like the only way for sales there is up, according to a report from market research company Gartner.

Bhavish Sood, a Gartner research director, said that improvement in global economic conditions has meant the Indian economy is not so stressed. “Along with a new stable government at the centre, this has helped in alleviating concerns about economic growth with early signs of spending in growth initiatives beginning to emerge.”

Sood said trends in the Indian software market include software as a service (SaaS), open source software adoption, and changing buying behaviours.

Microsoft held 25 percent of market share in 2014, followed by Oracle with 13 percent, IBM with 12 percent and SAP with eight percent.

Sood said: “After the last federal election the mood of the economy has changed and we are slowly seeing a revival in IT spending, particularly in areas of digital and a nexus of forces that combine cloud, mobile, social and big data.”

India showed the highest growth in the so-called BRICS (Brazil, Russia, India, China and South Africa) software markets.

Google offers cloud for those who can wait

Search engine Google is offering a new cloud computing service for customers who do not need data in a hurry.

The outfit has released a new compute service on the Google Cloud Platform that costs 70 percent less on average than an equivalent standard instance in the same configuration on the Google Compute Engine.

If you use a Google Compute Engine Preemptible Virtual Machine, Google can shut down the job at any time, and it may be a little slower.

Writing in his blog, Google Senior Product Manager Paul Nash said there were a variety of computer tasks that fit nicely into this pricing model.

The service, now in beta, would be good for fault-tolerant workloads that can be distributed easily across multiple virtual machines. Although jobs such as data analytics, genomics, and simulation and modelling can require lots of computational power, they can be run periodically, and can carry on even if one or more of the nodes they’re using goes offline.

Google’s budget service is somewhat similar to Amazon Web Services’ Spot Instances, also designed for jobs that can be interrupted. AWS’ model is different because its price can fluctuate according to demand, whereas Google’s prices are fixed. The Compute Engine Preemptible Virtual Machine can cost as little as US$0.01 per instance per hour.

Google is using leftover capacity in its data centres. If demand for its services spikes, the preemptible virtual machines (VMs) get bumped. Users are given a 30-second warning, which should give the application time to save its current state and work. No preemptible VM can run for more than 24 hours straight.
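A fault-tolerant worker on such a VM would typically trap the shutdown notice and checkpoint inside that 30-second window. The following is only a generic sketch, not Google's API: the checkpoint() function is hypothetical, and using SIGTERM as the notice is an assumption for illustration.

```python
import signal

saved_state = []

def checkpoint():
    # Hypothetical: persist progress to durable storage (e.g. an object store)
    # so a replacement VM can pick up where this one left off.
    saved_state.append("progress")

def on_preempt(signum, frame):
    # The warning gives roughly 30 seconds, so the handler must
    # finish checkpointing well inside that window.
    checkpoint()

# Register the handler for the termination signal.
signal.signal(signal.SIGTERM, on_preempt)
```

Jobs structured this way can simply be resubmitted to a fresh VM after a preemption, which is why batch workloads like analytics and simulation suit the pricing model.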

Satellite imagery processing start up Descartes Labs recently required 30,000 processors to churn through a petabyte of NASA imagery. The lower costs may help the company work with even larger sets of data. Financial investment firm Citadel is another big user of cloud computing that will put these instances to work.