Wednesday, October 10, 2012

AT&T and IBM to work together

AT&T, IBM Share Network to Lure More Customers to Cloud



AT&T Inc. (T) and International Business Machines Corp. (IBM) are combining their resources in a joint offering to seek a bigger slice of the $14 billion market for cloud services.

IBM, the world’s biggest computer services provider, will provide the data-storage facilities and services, and AT&T, the largest U.S. phone carrier, will offer the global network that clients will use to retrieve the data, the companies said today in a statement. They’ll split revenue from the deal. “Cloud services” refer to storing information and software in off-site data centers.

To improve its cloud offering, IBM is forging the closest relationship it’s ever had with a phone carrier. The Armonk, New York-based company will be the first to get direct access to the technology that controls AT&T’s network for business customers, and the two companies’ sales forces will coordinate on work with clients.

Giving AT&T’s customers the option of adding cloud services from IBM on a network they already use will be “huge,” said Andy Geisse, head of the phone company’s unit for business clients. “With our customers, we see cloud computing as a key part of their network infrastructure and of their compute environment going forward,” he said in a phone interview.

Working Together

As competitors from Verizon Communications Inc. (VZ) to Oracle Corp. (ORCL) and SAP AG (SAP) expand their cloud offerings, AT&T and IBM concluded that working together would help them seize a greater portion of the market. The computer industry will reap $14 billion this year from cloud services, up 24 percent from 2011, according to research firm IDC in Framingham, Massachusetts.

IBM is pursuing a target of $7 billion in cloud revenue by 2015 after such sales more than tripled last year. It hasn’t disclosed how much it currently makes from cloud computing.

“This is a major component of how we get to that $7 billion,” said Erich Clementi, senior vice president of IBM’s global technology services division.

With its access to the controls of Dallas-based AT&T’s network, IBM will be able to shift capacity on the network so it can push the flow of data to where it’s most needed, like to a client’s supply-chain management systems during a holiday season, Clementi said.

More Control
Customers will be able to decide how to store information in IBM’s data centers and access it through AT&T’s network. AT&T has been increasing the control it gives businesses through products that let employers block certain applications or websites from employees and run anti-virus software. IBM markets its cloud products to larger businesses that need tailored services, like ways to run a supply chain or manage health data.

IBM fell less than 1 percent to $207.99 at the close in New York. The shares have gained 13 percent this year. AT&T, up 23 percent this year, dropped 1.4 percent to $37.14 today.

Separately, IBM expanded its PureSystems lineup of computing, storage and networking technology to include three new products -- one for rapid transactions and two for analyzing data to help companies make faster decisions.

The products, which can be used within hours of setup, will help contribute toward the $16 billion the company expects to make from analytics by 2015, said Arvind Krishna, general manager of IBM’s information management software unit. IBM hasn’t disclosed how much revenue it now gets from analytics.

Friday, September 28, 2012

IBM's Watson (Supercomputer)

IBM's Supercomputer Watson may head to the cloud 


IBM is looking to expand its Watson supercomputer into a cloud-based service that can be used by a variety of different types of professionals. The move could end up making IBM's supercomputer readily available to the average physician or lawyer. "We want broad exposure for Watson. We want physicians all over the planet to be able to use it," John Gordon, solutions marketing manager at IBM, said to New Scientist.

"And we are now looking at ways of delivering Watson as a service to make sure that it is something that is very accessible and which doesn't require a significant level of technology investment by the user."
Watson is IBM's five-year-old supercomputer. The powerful machine can sift through massive quantities of data and give users answers to a variety of queries. To prove Watson's answer-giving prowess, IBM put the supercomputer on Jeopardy last year, where it beat former champions on the Alex Trebek-hosted game show in January 2011.

Watson is already in use at some major firms. The supercomputer is said to be helping Citigroup make financial decisions by calculating risk-assessment portfolios for some of the company's clients.
A move to the cloud would mean a more service-based approach for Watson. The cloud could allow Watson to adapt to individual user needs and learn user preferences. IBM offers the example of Watson tailoring chemotherapy recommendations: a cloud-based Watson could learn, on a case-by-case basis, that a patient would prefer a treatment that doesn't cause hair loss.
As services in the cloud continue to expand, it's interesting to see what types of technology can grow from being part of the cloud. Companies are no longer just using the cloud for storage; they are now building complete services in the ether. It will be interesting to see where the trend ends up going.

Let's just hope it doesn't wind up with Skynet and time-traveling robots.

New Virtualization storage


Hitachi Unveils SMB Storage Array



Midrange Hitachi Unified Storage VM mixes block, file, and object storage with heterogeneous virtualization under a single management interface.

Hitachi Data Systems on Tuesday announced the Hitachi Unified Storage VM (HUS VM), a new midrange array for small and midsized businesses (SMBs) that fits into Hitachi's lineup between the entry-level Hitachi Unified Storage (HUS) system and the high-end Hitachi Virtual Storage Platform (VSP).

The HUS VM is intended for businesses of 500 to 1,000 employees and can support as much as 4 PB of data or 64 PB of virtualized data under a single management interface. It comes in two configurations: a controller-only version that can front-end and virtualize other vendors' arrays, and a version with both controller and solid-state and serial-attached SCSI (SAS) disk drives. Last year, Hitachi dispensed with both Fibre Channel and SATA disk drives, saying it could get more economy from SAS drives. As many as 1,152 SAS drives and 128 solid-state drives can be accommodated.

The HUS VM supports CIFS, NFS, Fibre Channel, and iSCSI connectivity to Ethernet and Fibre Channel SANs. It uses the same microcode as the Hitachi VSP and is managed by Hitachi Command Suite 7. Included with the HUS VM are adapters and reference architectures for use with VMware, Microsoft, and Oracle, which enable management from these platforms' management consoles and allow backup and recovery from them as well. Reference architectures for SAP and virtual desktop infrastructures (VDI) are also included.

The HUS VM's controller is actually a dual controller sharing up to 256 GB of cache memory. The 5U controller can support logical unit numbers (LUNs) of up to 60 TB. The file module is 3U high and contains 32 GB of cache memory for each of its four clustered nodes. A total of 128 file systems support a maximum size of 256 TB. The array has 32 8-Gbps Fibre Channel connections and connects to storage with 6-Gbps SAS. In addition, Hitachi offers a 100% availability guarantee with the HUS VM and, according to company claims, the HUS VM migrates data 90% faster than competitive systems.

The controller-only version of the HUS VM starts at $156,000.


Wednesday, September 26, 2012

New storage technology..(Quartz Glass)


Good to welcome this invention



Hitachi is showing off a storage system using quartz glass that it claims will retain data for hundreds of millions of years.
Company researchers displayed the storage unit, consisting of a sliver of glass 2 cm square and 2 mm thick, which can hold 40 MB of data per square inch, about the same as a standard CD. The data is written in binary format by lasering dots on the glass in four layers, but the researchers say adding more layers to increase storage density isn't a problem.
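To get a feel for how "data as dots" works, here's a toy sketch of the principle: bytes are flattened into a grid of binary dots, and the dots can later be read back in order to recover the bytes. This is not Hitachi's actual encoding -- the grid width, single layer, and padding scheme here are invented purely for illustration.

```python
# Toy illustration of binary dot storage: bytes -> grid of 0/1 "dots" -> bytes.
# A real medium like Hitachi's glass uses multiple physical layers and its own
# layout; this sketch only shows the encode/decode principle.

def bytes_to_dots(data: bytes, width: int = 16) -> list[list[int]]:
    """Flatten bytes into rows of 0/1 dots, `width` dots per row, MSB first."""
    bits = [(byte >> i) & 1 for byte in data for i in range(7, -1, -1)]
    bits += [0] * (-len(bits) % width)  # pad last row with zero dots
    return [bits[i:i + width] for i in range(0, len(bits), width)]

def dots_to_bytes(grid: list[list[int]]) -> bytes:
    """Read the dots back in order and reassemble the bytes."""
    bits = [b for row in grid for b in row]
    out = bytearray()
    for i in range(0, len(bits) - len(bits) % 8, 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return bytes(out)

message = b"etched in glass"
grid = bytes_to_dots(message)
# Padding may append zero bytes, so strip them for this toy round-trip.
assert dots_to_bytes(grid).rstrip(b"\x00") == message
```

Because the format is just ordered binary marks, any future reader that can see the dots (say, through a microscope) and guess the row width can reconstruct the data, which is exactly the researchers' argument below.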
"The volume of data being created every day is exploding, but in terms of keeping it for later generations, we haven't necessarily improved since the days we inscribed things on stones," Hitachi researcher Kazuyoshi Torii told AFP. "The possibility of losing information may actually have increased," he said, pointing out that CDs and tape storage are predicted to last less than a few decades at best, and in many cases fail within years.
The glass has been shown to retain its data undamaged after being heated to 1,000° Celsius (1,832° Fahrenheit) for over two hours, and is impervious to radiation, water, and most chemicals. Hitachi said the data could conceivably be retrievable hundreds of millions of years in the future.
"We believe data will survive unless this hard glass is broken," said senior researcher Takao Watanabe.

Let's see how to retrieve the stored data

Storing the data is one thing, but reading it is quite another. The researchers say, however, that as it is stored in a simple binary format, actually retrieving the data should be possible for future civilizations as the dots can be read using a simple microscope.
The problem of writing and reading future storage media isn't new. NASA's golden record, a disc containing images and sounds from Earth that went out with the Voyager 1 and 2 probes, was shipped with a stylus and cartridge, along with pictorial references showing how to play it and a cover showing Earth's location.
Aliens reading our data might seem inconceivable, but if the glass storage really does last for a hundred million years, it's possible that mankind may not be around to read it either. A paper by Cornell University suggests the average lifespan of a species on Earth is around 10 million years, and given the way humans are fouling their own nest, it could be up to whatever evolves from bees, ants, or dolphins to figure out what these glass things are.



Thursday, January 19, 2012

SOPA (It's not an act, it's censorship of our freedom)


What is SOPA? SOPA is an acronym for the Stop Online Piracy Act. It's a proposed bill that aims to crack down on copyright infringement by restricting access to sites that host pirated content.

SOPA's main targets are "rogue" overseas sites like torrent hub The Pirate Bay, which are a trove for illegal downloads of movies and other digital content.

Content creators have battled against piracy for years, but it's hard for U.S. companies to take action against foreign sites. So SOPA's goal is to cut off pirate sites' oxygen by requiring U.S. search engines, advertising networks and other providers to withhold their services.

That means sites like Google wouldn't show flagged sites in their search results, and payment processors like eBay's PayPal couldn't transmit funds to them.

For more on this act, please visit http://en.wikipedia.org/wiki/Stop_Online_Piracy_Act

Wednesday, January 18, 2012

Aakash: A Historical Mistake By Indian Govt?

After reviewing the world’s cheapest tablet PC, Aakash, Syed Firdaus Ashraf of Rediff.com wrote, “Here’s my advice: (excuse me the scream but) DON’T BUY AKASH TABLET. Period.” A similar verdict was given by Jaimon Joseph of CNN-IBN, who said, “For the price it is being offered at, the Aakash is probably great value for money. But the question is, is it the best our students deserve. I think not.” Prasanto K Roy echoed similar emotions on BBC, “Probably the biggest challenge for the Aakash will be to keep up with the times. That’s what killed the Simputer – other than apps, by the time they tweak it and test it, portable computers will have jumped a generation.”

Despite much fanfare, the device drew criticism from all quarters, though it was not written off entirely. Experts feel that instead of focusing on the price and compromising on quality, it would have been better to create a feature-rich device and then work on bringing down its price.

Commenting on the failure of the device, Satish Jha, head of One Laptop Per Child (OLPC) India Foundation, said, “I think MHRD (ministry of human resource development) should not have gone for something like Aakash for two reasons. First, the best technology in PC is available in America and the cheapest components are available in China. So a tablet PC can be imagined only if America and China come together. Hence, there is no way to make a cheap and good device without America and China coming together.”

He added, “This is also the reason for disbelief in the product. People who have tested the device are shocked with the Government’s decision to invest in the project. In principle, I also support the desire to make cheaper computers, but it should be able to deliver what it has promised. It’s not just about creating a device but also about education of the students in India. If you ask a wrong question, you get a wrong answer. India asked a wrong question of making a good computing device at $35 and it got a wrong answer in terms of Aakash.” Jha said that it’s not possible to create a meaningful computer in less than $300. Anything less than that is a mere compromise.

An overhyped device?

Aakash was expected to be a game changer. The media gave it extraordinary hype, which was well deserved given the value it was expected to bring to the Indian education system. An affordable computing device is still a dream for many students in India, and there Aakash surely stood apart. But who would want to invest in a device that experts have described as a 'punishment for the students'?

Also, Aakash was supposed to provide millions of Indian students with a $35 solar-powered touch tablet, which turned out to be a false claim. As soon as Datawind started delivering Aakash, the reality of the device surfaced. A Zambotimes report rightly calls the project 'political pandering to nationalism', which has its own set of apologists and defenders. But one thing that was criticised from all quarters was that it was a bad 'technological compromise'.

It is worth mentioning here that the government plans to dump Aakash because of contractual and non-performance issues. In an attempt to save the opportunity, Datawind has promised that the remaining 70,000 devices (to be delivered by the month's end) will be Aakash 2, an upgraded version of the device. That's not all: this time, the government has ensured that students do not get faulty devices.

The new devices will have to pass a new quality protocol, prepared on the basis of responses from over 600 students of the Indian Institutes of Technology (IITs) and other engineering colleges. Over a million people have booked Aakash because it was backed by the government. Yet there is not much to lose: the bookings required no payment up front, an unusual format that is working in buyers' favour, considering the flak the device has received from all corners.