Lee Byung-song, left, a senior executive at SK C&C, poses with Jeff Markham, technical director for Hortonworks' Asia-Pacific region, after the two companies signed a memorandum of understanding to collaborate on big data business. The signing was held at SK's main office in Seoul, Wednesday. / Courtesy of SK C&C
By Kim Yoo-chul
SK C&C, a unit of SK Group, announced a strategic partnership with Hortonworks of the United States in the "big data" business, Wednesday.
The partnership will help accelerate the adoption of enterprise Apache Hadoop by deeply integrating the Hortonworks Data Platform with SK C&C's various big data analytics tools, according to officials.
Big data refers to massive volumes of both structured and unstructured data that are difficult to process using traditional database and software techniques. Analyzing such data can help companies improve operations and decision making.
"The memorandum of understanding (MOU) with Hortonworks will lay a firm foundation for us to develop big data solutions and related technologies for use in the telecommunications, finance, security, manufacturing, service and semiconductor industries," said Lee Joon-ho, head of SK C&C's public relations office.
Lee declined to provide further details about the agreement.
Since 2009, SK C&C has been collaborating on big data projects with other SK Group technology affiliates, such as content unit SK Planet, telecom affiliate SK Telecom and fixed-line operator SK Broadband.
SK C&C launched a task force last year to develop and fine-tune its big data-related businesses, a move aimed at finding new growth opportunities in the segment.
The partnership with SK C&C will also help Hortonworks sell its data platform to SK affiliates and SK's local partner companies.
SK C&C and Hortonworks have committed to integrating their engineering strategies.
Lee said SK C&C will ask its American partner to jointly run programs to nurture experts for Hadoop solutions.
Hortonworks describes itself as the only fully open-source software provider to develop, distribute and support an Apache Hadoop platform explicitly architected, built and tested for enterprise-grade deployments.