“At Hitachi, we usually perform a revolutionary leap followed by an evolutionary jump,” says Hitachi Data Systems Vice President (Product Planning) Michael Hay.
Is Big Data another buzzword or is it really essential for companies and governments to find new ways of managing it?
Big Data is a bit of both. Earlier, cloud was the buzzword, but now we are moving towards actual adoption of cloud-based systems. I think this is a natural progression.
When Big Data or cloud or something initially starts out, there is a lot of hype. Eventually things move into an implementation phase where people realise the value. For instance, early on VMware, as a start-up, had a lot of buzz and promise with virtualisation, then there was silence for seven or eight years. However, during the last two or three years, VMware has proved itself to be a big deal.
Gartner calls this the Hype Cycle, and so this is natural. Big Data has a lot of hype around it, but it needs to be taken seriously. Big Data will not solve all our problems. People have huge expectations, but Big Data will not print money. It will help organisations save money; it might even open new opportunities.
What steps should government departments take in a country like India, where the reach of IT is still quite mediocre, to take full advantage of Big Data technologies?
As IT gets deployed in ever-larger areas, someone needs to pick up the role of “the dreamer.” Whenever a new deployment of IT is happening, we should also focus on extracting data and analysing it from different perspectives. Without the right kind of analysis, the data is practically useless.
What kind of solutions would you like to provide to government organisations in India and in other countries?
We are currently in talks with a few government organisations in India. Internationally, one that has been indirectly spoken about is the archive for George W. Bush at the National Archives and Records Administration in the United States. We were able to build that system using our technology around the Hitachi Content Platform.
Our storage is used to warehouse the content of the previous president. We are also engaged quite heavily with other governments in the APAC region. For any government organisation, we look to take our current portfolio and work with key system integrators that specialise in government deployments to build a solution with a specific problem in mind.
There used to be a trend in many governments to build complete, independent ICT offerings themselves. There has since been a transition towards COTS (Commercial Off-The-Shelf) systems, which means procuring more standard components and then using system integration techniques to adapt them to particular problems in government.
In which sectors is Big Data most useful?
Government may be an early user of Big Data; certainly security applications have been talked about, like the ability to mine a lot of unstructured data on the Internet to secure people or look for scary things in crowds. For example, Hitachi has recently unveiled a passive explosive detection system. It puffs air as someone walks through a gate to detect traces of explosives. There are a variety of uses that we can potentially talk about, and it depends on what the target is.
What kinds of solutions are available for e-Governance?
Governments using traditional software packages from Oracle or Microsoft can run them on best-in-class platforms from Hitachi, where the stability, scalability and performance of the systems can be improved. As these packages are deployed, governments are assured of a highly reliable infrastructure to depend upon.
Hitachi Data Systems has deployed a unified storage with virtualisation. Tell us more about it.
There are a couple of different areas of virtualisation; we predominantly talk about block virtualisation. On top of that block virtualisation, we can add file- or object-type platforms that access data with different semantics in addition to the traditional block interface. That virtualisation layer is quite important because, whether customers have third-party storage assets or all-Hitachi assets, we can reuse that capacity over a longer period of time. Even with older Hitachi assets, it enables non-disruptive migration, where we can move customers to the new platform easily.
These layers become critically important for us to facilitate a continuous migration to new technologies over time for our customers. We also offer virtualisation capabilities at the file level with our NAS product, so it can sit in front of content and other third-party devices, again using standard protocols. In this way the notion of re-leveraging old assets, making them last a little longer for improved operation and total cost of ownership, is achieved.
How is the technology evolving over a period of time?
At Hitachi, we usually perform a revolutionary leap followed by an evolutionary jump. If you look at the USP to USP-V generation, you see the evolutionary jump, whereas from the USP-V to the VSP was a revolutionary leap. Across platform generations, our users can experience consistent applicability of our key storage features like block and file virtualisation, data protection, and storage management. Most recently, Hitachi has scaled down our enterprise microcode from the VSP and placed it in the HUS-VM. This represents a revolutionary leap with consistency as a major theme, because the microcode is essentially the same.
In what ways is Big Data enabling a move towards the cloud? Please provide us with an overview of the cloud based Big Data systems developed by Hitachi.
What we’re seeing is that, for both the private and public sectors, depending on clouds is becoming a challenge, especially if data has to be managed at different locations. The idea of capturing data and pumping it into the cloud, specifically private cloud infrastructures, for unstructured and semi-structured data is an essential first step. For HDS, the Hitachi Data Ingestor and the Hitachi Content Platform play a big role in this area, and certainly those types of systems have been picked up by government and private organisations around the world to help them build private cloud infrastructures.
What solutions do you have for the healthcare sector?
HDS has a big focus on the healthcare sector through a vertical business in Healthcare and Life Science, with key people focused on selling to hospitals, life science and, to some extent, pharma. In particular, we have a product called the Hitachi Clinical Repository, which is kind of like an unstructured data warehouse for healthcare information. Further, HDS has been solving some interesting problems with organisations in the United States using our complete portfolio; for example, building electronic patient records or pumping medical studies from remote clinics back to a centralised core with HCP.
What solutions do you have for the education sector?
We have done a lot with e-research, and again, the Hitachi Content Platform plays a key role in terms of helping our customer in Australia build digital research spaces. Essentially the Content Platform and the Data Ingestor facilitate scientific engagement and collaboration, and all the while help to govern and retain the resulting data for the long haul.
Any other interesting developments you would like to discuss?
There are a couple of interesting things in the announcements we have made over the past several weeks. We have an interesting demonstration robot. Her name is EMIEW, and she has some pretty cool technology behind her: she can recognise people’s faces and find lost objects from pictures, relying on some interesting technology for visual search and object recognition. Governments are also really sensitive to the long-term preservation of data; they have to store data for the life of the country. Some of the research we have done on really long-term storage media with quartz glass has also recently been announced.