Showing posts with label Big Data. Show all posts

Monday, January 27, 2020

Iguazio raises $24M for its data science platform

Iguazio, a start-up based in Herzliya, Israel, raised $24 million in funding for its data science platform for real-time machine learning applications.

The Iguazio data science platform helps data scientists create real-time AI applications while working within their chosen machine learning stack.

The funding was led by INCapital Ventures, with participation from existing and new investors, including Pitango, Verizon Ventures, Magma Venture Partners, Samsung SDS, Kensington Capital Partners, Plaza Ventures and Silverton Capital Ventures.

“This is a pivotal time for AI. Our platform helps data scientists push the limits of their real-time AI applications and see their impact in real business environments,” said Asaf Somekh, co-founder and CEO of Iguazio. “With support from INCapital, Kensington Capital Partners, and our other investors, we are ready to expand our international team and reach our ambitious goals.”

http://www.iguazio.com

Tuesday, February 5, 2019

Databricks raises $250 million for Big Data Analytics

Databricks, a start-up based in San Francisco that was founded by the original creators of Apache Spark, raised $250 million in a Series E funding for its unified analytics solutions.

The company's Unified Analytics allows organizations to do data science on massive data sets. The approach addresses data silos and the gap between data processing and machine learning platforms.

Databricks said it generated in excess of $100 million in annual recurring revenue during 2018 and experienced approximately 3x year-over-year growth in subscription revenue during the last quarter of 2018.

The new funding round was led by Andreessen Horowitz. Coatue Management, Microsoft, and New Enterprise Associates (NEA) also participated. The company has now raised $498.5 million to date. Its valuation now stands at $2.75 billion.

“Databricks has gone from almost no revenue to over $100 million in annual recurring revenue in just three years, putting us among the fastest growing enterprise software companies,” said Ali Ghodsi, CEO and co-founder of Databricks. “What’s driving this incredible growth is the market’s massive appetite for Unified Analytics. Organizations need to achieve success with their AI initiatives and this requires a Unified Analytics Platform that bridges the divide between big data and machine learning.”

“Databricks is the clear winner in the big data platform race,” said Ben Horowitz, co-founder and general partner at Andreessen Horowitz. “In addition, they have created a new category atop their world-beating Apache Spark platform called Unified Analytics that is growing even faster. As a result, we are thrilled to invest in this round.”

http://www.databricks.com

Tuesday, August 28, 2018

Cloudera Data Warehouse handles 50 PB loads

Cloudera Data Warehouse entered general availability status. The service is a modern hybrid cloud data warehouse for storing, analyzing and managing data in public clouds and on-premises. The company said its hybrid, cloud-native architecture routinely handles 50 PB data workloads, delivering sub-microsecond query performance and serving clusters with hundreds of compute nodes.

NYSE is using Cloudera to run over 80,000 queries a day on petabytes of data, while adding 30 TB of fresh data daily.

"Before Cloudera, several data warehouse appliances were necessary to support our complex analytic requirements including market surveillance and member compliance analysis. Because the warehouse appliances could not scale we were forced to silo our data by market," said Steve Hirsch, Chief Data Officer, Intercontinental Exchange / NYSE.

The company also announced the availability of Cloudera Altus Data Warehouse, a data warehouse as-a-service, built with the same Cloudera Data Warehouse hybrid, cloud-native architecture.


Wednesday, February 7, 2018

Rubrik to acquire Datos IO

Rubrik agreed to acquire Datos IO, a market leader in backup and recovery for NoSQL databases and big data file systems. Financial terms were not disclosed.

The acquisition of Datos IO will extend Rubrik’s reach into mission-critical cloud applications and databases increasingly adopted by application and DevOps teams at Fortune 500 companies.

The companies said they share a common vision for building a control plane that can automate, orchestrate, and secure data in the cloud.

Datos IO’s flagship platform RecoverX pioneers a radically new approach to comprehensive data management for modern cloud applications built on modern NoSQL databases (MongoDB, Cassandra, Couchbase, Amazon DynamoDB) and big data file systems (Cloudera, Hortonworks). Datos IO has filed 22 patents in application-aware data management for enterprise use cases of backup and recovery, test/dev refresh, in-place analytics, and cloud mobility. Fortune 100 companies have chosen Datos IO to protect and manage their cloud applications enabling digital transformation, including three of the top Fortune 15 companies and the world’s largest home improvement retailer.

“As enterprises adopt NoSQL cloud databases to undertake digital transformation and AI initiatives, the need to manage and recover applications and data is becoming top of mind. We are excited to have Datos IO join the Rubrik family to accelerate innovation in how enterprises manage and recover this modern application stack,” said Bipul Sinha, Co-Founder and CEO, Rubrik.

Rubrik Raises $180M for Cloud Data Management

Rubrik, a start-up based in Palo Alto, California, closed $180 million in Series D funding for its cloud data management solutions.

Rubrik's platform delivers automated cloud data backup, instant recovery, offsite replication and data archival capability. One Intel-powered appliance manages all data in the cloud, at the edge, or on-prem for backup, DR, archival, compliance, analytics, and copy data management. The company said it is on an annual run rate approaching $100 million.

The latest investment round was led by IVP with strong participation from Lightspeed Venture Partners and Greylock Partners, bringing total equity raised to $292 million. 

Thursday, January 19, 2017

CenturyLink Pushes into Big Data as a Service with Cloudera

CenturyLink launched Big Data as a Service (BDaaS) with Managed Cloudera, a new managed service offering that combines CenturyLink’s expertise in data and advanced analytics, network, cloud and application services with the highly secure Apache Hadoop-based data management and analytics platform from Cloudera.

CenturyLink said its BDaaS is enhanced by adding data and advanced analytics consulting services supported by a deep bench of Cloudera-certified data scientists and Cloudera Hadoop solution administrators, developers and architects. The solution, bolstered by CenturyLink’s global high-speed network connectivity, provides storage, processing, and management components deployed on CenturyLink Cloud Bare Metal servers. The bare metal private cloud environments have been certified by Cloudera.

“Forward-thinking organizations around the world are quickly becoming more agile and responsive to their customers’ needs across all channels,” said Gary Gauba, chief enterprise relationship officer and president, Advanced Solutions Group, CenturyLink. “These enterprises are seeing significant competitive advantages by better leveraging their data, and our new managed service helps them achieve their big data objectives.”

http://www.centurylink.com

Thursday, December 15, 2016

Nokia to acquire Deepfield for Big Data Analytics

Nokia agreed to acquire Deepfield, a start-up specializing in real-time analytics for IP network performance management and security.

Deepfield, which was founded in 2011 and is based in Ann Arbor, Michigan, developed an analytics platform that identifies over 30,000 popular cloud applications and services. Its Internet Genome tracks how traffic runs to and through networks to reach subscribers, in real time, and without the need for probes, taps and monitors in the network itself.

Nokia said it plans to couple Deepfield's big data analytics with the dynamic control capabilities of open SDN platforms, such as the Nokia Network Services Platform (NSP) and Nuage Networks Virtualized Services Platform (VSP). Together, these products become the cognitive "brain" that makes real-time, automated changes to wide area networks (WANs) and datacenter networks so they can quickly adapt to changes in application demand, flow and traffic patterns. This will allow Nokia customers to drive greater network efficiency, help assure quality and enhance security - without manual intervention, and in real time.

Nokia's service assurance and customer experience management portfolios would also leverage Deepfield's big data analytics, including per subscriber application performance, to automate actions that ensure ongoing service health and customer satisfaction.

Basil Alwan, president of Nokia's IP/Optical Networks business group, said: "We are impressed with Deepfield's unique approach to network analytics and their deployments with major providers around the globe, delivering critical visibility into how leading cloud applications and services flow through their networks. Combining Deepfield's cutting-edge analytics with Software Defined Networking techniques (SDN) will allow our customers to automate engineering and assurance processes while enhancing performance, utilization and security. We believe this capability will only increase in importance as networks and applications become more complex, diverse and dynamic."

Craig Labovitz, founder and CEO of Deepfield, said: "We are very pleased to join Nokia, a like-minded global leader in IP networking with shared values in network innovation. I look forward to leveraging the strength of Nokia's world-class customer, sales and support footprint to take our Deepfield technology worldwide. This will also give us a solid foundation from which to accelerate the creation of new value - both in the Deepfield portfolio, and in joint areas such as telemetry and automation."

http://www.nokia.com

Thursday, July 14, 2016

AT&T's Threat Intellect Sifts Billions of Security Events in Minutes

AT&T unveiled a new Threat Intellect service that takes a Big Data + machine learning approach in sifting through billions of security events captured on the AT&T backbone to identify abnormal and malicious activity based on data patterns.

This new approach, which will be the brains behind AT&T security services going forward, provides visibility into the data patterns and threat activity by using multitudes of unique threat signature data streams, analytics and intelligence. Data can come from mobile devices, applications, data centers, or through AT&T security services. The software-defined network can then apply policies to mitigate the threat.
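As a rough illustration of the pattern-based detection described above, a sketch like the following flags sources whose event volume deviates sharply from a learned baseline. The thresholding logic, event schema and figures here are hypothetical, invented for illustration; they are not AT&T's actual algorithm.

```python
from collections import Counter

def flag_anomalies(events, baseline, factor=3.0):
    """Flag sources whose event counts exceed `factor` times their baseline rate."""
    counts = Counter(src for src, _ in events)
    return sorted(src for src, n in counts.items()
                  if n > factor * baseline.get(src, 1))

# Toy event log: (source_ip, event_type) pairs. One source is scanning heavily.
events = [("10.0.0.5", "scan")] * 40 + [("10.0.0.7", "login")] * 2
baseline = {"10.0.0.5": 5, "10.0.0.7": 5}
print(flag_anomalies(events, baseline))  # ['10.0.0.5']
```

In a production system the baseline itself would be learned from historical traffic and the flagged sources handed to the policy layer for mitigation, as the article describes.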

AT&T estimates this automation will improve the speed at which it can deploy security protections by over 95%, greatly improving threat detection and resolution.

“AT&T secures more connections than any communications company in North America,” said Steve McGaw, chief marketing officer, AT&T Business Solutions. “No carrier experiences the depth and scale of security threats we see on a daily basis – more than 30 billion vulnerability scans and 400 million spam messages are detected on our IP network. The power of Threat Intellect gives us the ability to process 5 billion security events – a full day’s worth of activity for all of our security customers combined – in only 10 minutes.”

“In the past, you had to know exactly where a specific file was stored to access it. Now, you only need a key word to find that file,” McGaw said. “AT&T Threat Intellect has a similar capability. It is the power behind every AT&T firewall, network security protection and every other security capability we have integrated in our network and services.”

AT&T said 117 petabytes of traffic are traversing its network every day.

http://www.att.com




See video: https://youtu.be/rLWKTxw1fcE

Sunday, January 10, 2016

Blueprint: How Advanced Analytics Will Change Business in 2016

by Shawn Rogers, Chief Research Officer, Dell Statistica

As advanced analytics continue the trajectory from buzzworthy idea to business imperative, we will see companies and consumers alike beginning to optimize their processes for data gathering and analysis in new ways. From healthcare to manufacturing, virtually no industry will be untouched as advanced analytics reach a new level of maturity in the coming year. Here are four specific ways I expect advanced analytics to evolve and impact businesses this year:


1. Analytics at the edge becomes the new normal as the Internet of Things (IoT) expands.

Thanks largely to advancements in modern data technology, more organizations than ever before are putting the right data on the right platform for the right reason. This new level of data efficiency greatly reduces – and, ultimately, may eliminate – the need to pull data into a centralized source, such as a data warehouse or analytic sandbox, for the purpose of analysis. Instead, given the distributed nature of connected devices and the explosive growth of IoT infrastructures, more organizations will look to execute analytics on the data-gathering devices themselves, a model commonly known as analytics at the edge. Applying a predictive model and running the analytics where the data lives eliminates the time, bandwidth and expense required to transport the data, enabling immediate action to be taken in response to insights. For example, if surveillance cameras have the ability to distinguish between routine and non-routine video images, they can transmit only the suspicious images to long-term storage, reducing cost and avoiding bandwidth issues. In 2016, the growth of IoT in particular will spur the movement of analytics out to the edge, in order to give companies the ability to harness and use IoT data quickly and economically. The power of IoT ultimately lies with the ability to analyze data and move at the real-time speed of a specific workflow. Analytics at the edge makes that possible.
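The surveillance-camera scenario above can be sketched in a few lines: the edge device scores each frame locally and transmits only the suspicious ones. The scoring function and threshold below are illustrative stand-ins for a real predictive model, not part of any specific product.

```python
def score(frame):
    # Stand-in for an on-device predictive model: here, mean pixel intensity.
    return sum(frame) / len(frame)

def filter_at_edge(frames, threshold=0.8):
    """Run the model where the data lives; keep only frames worth transmitting."""
    return [f for f in frames if score(f) > threshold]

# Three toy "frames" as pixel arrays; only the bright one is sent upstream,
# saving the bandwidth and cost of shipping all raw data to a central store.
frames = [[0.1, 0.2, 0.1], [0.9, 0.95, 0.9], [0.3, 0.2, 0.4]]
print(len(filter_at_edge(frames)))  # 1 frame transmitted instead of 3
```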

2. “Citizen data scientists” continue to emerge across the workplace and shape applications of analytics.

We are starting to see a new breed of analytics users cropping up throughout organizations – non-analyst employees known as citizen data scientists. These every-day, non-technical users typically leverage a basic background in math or social sciences to analyze data as an add-on to their other workplace responsibilities, helping to fill the gap between the amount of data companies need crunched and the number of trained data scientists they can hire. Citizen data scientists are going to play a large role this year in increasing the demand for processes and interfaces that make analytics more easily digestible. As citizen data scientists experience a learning curve in wrangling data, running the optimal analytics and presenting the outcome of those insights, vendors will need to focus on delivering quick-start analytics templates and reusable workflows. Once the learning curve has been overcome and the right capabilities have been delivered, citizen data scientists will be the driving force behind the use of analytics to drive innovation.

3. Analytics will show the highest ROI for targeted, vertical market use cases.

We’ve already seen evidence that advanced analytics produce the highest ROI when applied to targeted, vertical markets. Regulated manufacturing in particular will continue to lead other industries in advanced analytics adoption and ROI in 2016, as this industry features not only numerous processes that greatly impact the precision and quality of a given production run, but often a large number of regulatory requirements necessitating data and documentation. As such, regulated manufacturers will increasingly rely on advanced analytics platforms to help them identify strengths and weaknesses in their processes, while also documenting the information they need to meet regulatory requirements. For example, pharmaceutical manufacturers might leverage advanced analytics to optimize the drug creation process and avoid catastrophic batch losses, while also using analytics tooling to confirm that their processes have been tested and validated as required by governing regulatory bodies.

4. Analytics will become the starting point for virtually all innovations.

Advanced analytics not only help companies optimize and improve processes but also better serve their customers through new innovations. This trend will grow exponentially in 2016 as organizations continue to realize the true value in leveraging predictive analytics. Service departments will have the ability to take prescribed actions to prevent issues from arising. Doctors will increasingly run analytics to offer precision healthcare and personalized medicine that better serves patients. Patients themselves will bring their own data to the table, creating a new layer of both challenges and opportunities for data-driven healthcare leaders. This trend of data-driven analytics advancing each and every aspect of the business – from inception to completion – will only continue to evolve until, ultimately, all forms of innovation trace back to analytics in some way.

If your company hasn’t begun to incorporate advanced analytics, bear in mind that 2016 is poised to mark a tipping point in the evolution of corporate data science. Until now, having the ability to leverage advanced analytics has been a competitive advantage. By this time next year, it will be status quo. Not only will companies without advanced analytics find themselves falling behind their competition, but they will have an extremely difficult time catching back up, as their competitors with analytics improve and optimize operations at an exponential, not linear, rate. It’s a time of great opportunity for those prescient enough to make bold, decisive moves. And it’s a year that will define who owns the future of innovation and who gets left behind.

About the Author

Shawn Rogers is Chief Research Officer, Dell Statistica, Dell Software.  He is an internationally recognized thought leader, speaker, author and instructor on the topics of big data, cloud, data integration, analytics, data warehousing and social analytics. Prior to joining Dell, he was Vice President of Research for Business Intelligence and Analytics at Enterprise Management Associates, a leading analyst firm. He co-founded the BeyeNETWORK, a global online publication covering business intelligence, data warehousing and analytics. He was also a partner at DMReview magazine and has held various executive level positions with technology companies.





Monday, November 30, 2015

Iguaz.io Raises $15 Million for Big Data Storage

Iguaz.io, a start-up based in Israel, announced $15 million in Series A funding for its data management and storage solutions for Big Data, IoT and cloud applications.

The funding round was led by Magma Venture Partners and includes additional investments from JVP and large strategic investors.

The iguaz.io founding team comprises former executives from successful technology companies in the fields of storage, cloud computing, high-speed networking, analytics and cyber-security, including XtremIO (acquired by EMC), XIV (acquired by IBM), Mellanox, Voltaire and Radvision (acquired by Avaya).

"Enterprise customers have been sharing their pain points and challenges with us as they try to adopt Big Data and predictive analytics in their business," said Asaf Somekh, co-founder and CEO of iguaz.io. "We designed our solution from the ground up to address these challenges and allow our customers to focus on their applications and business."

"The IT industry is undergoing major shifts with the spread of cloud technologies on the one hand and new data consumption requirements on the other," said Yaron Haviv, co-founder and CTO of iguaz.io. "We're leveraging state-of-the-art hardware with our novel software architecture enabling customers to take a leap forward in their data-driven businesses."

http://www.iguaz.io

Tuesday, November 24, 2015

Australia's Macquarie Telecom Deploys MapR

Macquarie Telecom, Australia’s leading managed hosting and business-only telecommunications company, has deployed the MapR data platform to help secure the communications of the Australian government.

Macquarie Telecom’s Government Division today secures telecommunications for 42% of government agencies in Australia. The company provides government employees with a secure Internet gateway, so they can safely access external public websites, downloads, and email services.

“With the growing complexity of the digital threat landscape and the development of SaaS and cloud services, Macquarie Telecom must secure and analyze an exponentially growing amount of data, as well as predict increasingly sophisticated cyber-attacks,” commented Aidan Tudehope, managing director of Macquarie Hosting Group at Macquarie Telecom. “Big data analytics in a hybrid cloud services model is the next big thing in complex security environments such as those of governments. We needed a comprehensive platform that is able to analyze data in the cloud and most importantly in real-time, which is what MapR is providing us with today. We considered several distributions of Hadoop, and chose the MapR Distribution as best for our needs.”

https://www.mapr.com/company/press-releases/macquarie-telecom-deploys-mapr-secure-australian-government-communications

Thursday, November 5, 2015

Cask Data Raises $20 Million for Enterprise-class Apache Hadoop

Cask Data, a start-up based in Palo Alto, California, raised $20 million in Series B funding for its enterprise-class Apache Hadoop solutions.

Cask's flagship offering, the Cask Data Application Platform ("CDAP"), provides an open source layer on top of the Hadoop ecosystem that adds enterprise-class governance, portability, security, scalability and transactional consistency. From Data Lakes to Data Apps, the CDAP platform is ideal for enterprise environments because it abstracts many layers of the Hadoop ecosystem, allowing developers to use their existing skills to build high-performance, large-scale Big Data applications. The company said its approach dramatically accelerates development of applications and deployment into production, cutting average time to implement by more than 80%, while retaining the operational controls required by today's enterprise customers. Major customers and partners include AT&T, Cloudera, Salesforce, Pet360 and Lotame.

The funding was led by Safeguard Scientifics with participation from Battery Ventures, Ignition Partners and other existing investors.

"Big data has moved into the mainstream, but enterprises continue to struggle with the complexities and new skill sets required in the Hadoop ecosystem," said Cask Founder and CEO, Jonathan Gray. "Because our platform can layer on top of any distribution, instantly integrate with new and existing data stores, and easily support both Spark or MapReduce, it delivers real value for enterprises in a data-heavy environment, slashing development and deployment timelines. We are excited to be a part of the Safeguard family of partner companies. This financing, along with the operational expertise and guidance from our new board members Phil and Frank, will allow us to take Cask to the next level."

http://www.cask.co

Wednesday, October 7, 2015

Teradata Expands to AWS

Teradata, which offers a portfolio of big data analytic solutions, will offer its data warehousing as a service on AWS.

The initial version of Teradata Database on AWS will be offered on a variety of individual multi-terabyte virtual servers -- known as Amazon Elastic Compute Cloud (EC2) instances -- in supported AWS regions via a listing in the AWS Marketplace.

"Today, Teradata is adding more innovation by coupling data warehousing capabilities with public cloud convenience," said Tony Cosentino, Vice President and Research Director at Ventana Research, a leading benchmark research and advisory services firm. "No longer is cutting-edge technology the sole domain of the upper echelon of the corporate community; soon virtually any organization on the planet can tap into – and trust – the power of Teradata."

http://www.teradata.com

Monday, September 28, 2015

ODPi Project Envisions Open Ecosystem for Big Data

ODPi, a Linux Foundation Collaborative Project aimed at fostering an open ecosystem for Big Data, announced that membership in the organization has doubled since it got underway in February. Members to date represent a diverse group of Big Data solution providers and end users, including Altiscale, Ampool, Capgemini, CenturyLink, DataTorrent, EMC, GE, Hortonworks, IBM, Infosys, Linaro, NEC, Pivotal, PLDT, SAS Institute Inc., Splunk, Squid Solutions, SyncSort, Telstra, Teradata, Toshiba, UNIFi, VMware, WANdisco, Xiilab, zData and Zettaset.

"ODPi is a useful downstream project for the community to work on a common reference platform and set of technologies around Hadoop," said Jim Zemlin, executive director at The Linux Foundation. "We've seen this model work with open source technologies experiencing rapid growth and know it can increase adoption and open up opportunities for innovation on top of an already strong Hadoop community."

Technical milestones include the release of an initial ODPi core specification and reference implementation, created by developers from across the Big Data landscape, that simplifies upstream and downstream qualification efforts. More than 35 maintainers from 25 companies are already dedicated to this ongoing work. The planned ODPi Certification Program is also underway; its goal is to ensure consistency and compatibility across the Big Data ecosystem.


http://www.ODPi.org
http://collabprojects.linuxfoundation.org/

Tuesday, September 1, 2015

SAP's New HANA Vora Software Extends In-Memory Query Engine to Spark

SAP introduced its new HANA Vora software, which brings context-aware decision making to data sets encompassed by Apache Spark.

SAP HANA Vora is a new in-memory query engine that leverages and extends the Apache Spark execution framework to provide enriched interactive analytics on Hadoop. As a socket inside the Spark framework, the Vora capabilities can tap into distributed Big Data. It extends the in-memory computing innovation from SAP to distributed data and provides OLAP-like analytics with a business semantic understanding of data in and around the Hadoop ecosystem.
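To make "OLAP-like analytics" concrete, the sketch below shows the kind of rollup along a business dimension that such an engine performs, here in plain Python over an in-memory table. The schema and figures are invented for illustration; Vora itself executes this class of query inside the Spark framework against data distributed across Hadoop.

```python
from collections import defaultdict

# Toy fact table, standing in for data spread across a Hadoop cluster.
sales = [
    {"region": "EMEA", "product": "A", "revenue": 120},
    {"region": "EMEA", "product": "B", "revenue": 80},
    {"region": "APJ",  "product": "A", "revenue": 200},
]

def rollup(rows, dim):
    """Aggregate revenue along one OLAP dimension (e.g. region or product)."""
    totals = defaultdict(int)
    for row in rows:
        totals[row[dim]] += row["revenue"]
    return dict(totals)

print(rollup(sales, "region"))   # {'EMEA': 200, 'APJ': 200}
print(rollup(sales, "product"))  # {'A': 320, 'B': 80}
```

The value an in-memory engine adds over this sketch is doing such aggregations interactively over petabyte-scale distributed data, with business semantics (hierarchies, currencies, fiscal periods) layered on top.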

At a launch event in San Francisco, company executives said SAP HANA Vora is key to extending cloud growth outside of its SAP HANA customer base.

“Our mission at SAP is to empower businesses to lead the digital transformation in their industry,” said Quentin Clark, chief technology officer and member of the Global Managing Board of SAP SE.  “In order to succeed in this digital transformation, companies need a platform that enables real-time business, delivers business agility, is able to scale and provides contextual awareness in a hyper-connected world. With the introduction of SAP HANA Vora and the planned new capabilities in SAP HANA Cloud Platform, we aim to enable our customers to become leaders in the digital economy.”

“As part of our Big Data initiative, we currently have Hadoop and SAP HANA deployed in our enterprise IT landscape to help manage large unstructured data sets,” said Aziz Safa, VP and GM, Intel IT Enterprise Applications and Application Strategy. “One of the key requirements for us is to have better analyses of Big Data, but mining these large data sets for contextual information in Hadoop is a challenge. SAP HANA Vora will provide us with the capability to conduct OLAP processing directly on these large, rich data sets all in-memory and stored in Hadoop. This will allow us to extract contextual information and then push those valuable insights back to our business.”

SAP HANA Vora is planned to be released to customers in late September; a cloud-based developer edition is planned to be available at the same time.

http://www.sap.com

Wednesday, August 5, 2015

Hortonworks Hits Revenue of $30.7 million, up 154% YoY

Hortonworks, which specializes in Open Enterprise Hadoop, reported Q2 revenue of $30.7 million, an increase of 154 percent over the $12.1 million in the second quarter of 2014. There was a total GAAP gross profit of $17.5 million for the second quarter of 2015, compared to gross profit of $5.5 million in the same period last year.

"We are very pleased with our second quarter performance which was highlighted by support subscription revenue growth of 178% year-over-year and solid customer momentum with the addition of 119 new support subscription logos," said Rob Bearden, chief executive officer and chairman of the board of directors of Hortonworks. "As leading enterprise organizations continue to deploy the Hortonworks Data Platform in production at scale, as evidenced by our 144% dollar-based net expansion rate over the trailing four quarters, we could not be more thrilled to serve as their trusted IT partner during this transformational period in the data management industry."

http://hortonworks.com/

Thursday, July 23, 2015

IBM Acquires Compose for Cloud Database Services

IBM has acquired Compose, a start-up offering MongoDB, Redis, Elasticsearch, PostgreSQL, and other database as a service (DBaaS) offerings targeted at web and mobile app developers. Financial terms were not disclosed.

Compose, which was founded in 2010 and is based in San Mateo, California, offers auto-scaling, production-ready databases including MongoDB, Redis, Elasticsearch, PostgreSQL and RethinkDB. The service includes 24x7 monitoring and management by DBaaS DevOps experts. Compose features:

  • "Containerized" DBaaS platform technology – enabling fast deployment and scaling of popular open source DBaaS services for customers;
  • Auto-scaling with predictable performance;
  • Built-in redundancy, backup, failover for uninterrupted DBaaS service & application uptime;
  • Valuable add-ons including Compose Transporter, which helps developers move data between services like MongoDB and Elasticsearch for easier application development and to provide a better end-user experience.

IBM said the acquisition of Compose continues its commitment to open source technology and communities across the entire cloud stack. In addition to this latest announcement and last month's move to make its container technology available through Docker, IBM is a founding member of the Cloud Foundry and OpenStack Foundations, a platinum sponsor of the Node.js Foundation and a sponsor of the Open Container Project.

"Compose's breadth of database offerings will expand IBM's Bluemix platform for the many app developers seeking production-ready databases built on open source," said Derek Schoettle, General Manager, IBM Cloud Data Services. "Compose furthers IBM's commitment to ensuring developers have access to the right tools for the job by offering the broadest set of DBaaS services and the flexibility of hybrid cloud deployment."

http://www.dashdb.com
http://www.cloudant.com

Tuesday, July 14, 2015

MapR Reports Triple Digit Growth for its Apache Hadoop

MapR Technologies, a start-up based in San Jose, California, reported more than 100% growth in bookings and billings during Q2 2015 compared to the same quarter in the prior year for its Apache Hadoop solutions.

MapR processes big and fast data on a single platform, enabling real-time applications for enterprise deployments.

“New customer adoption and expanded deployments of the MapR Distribution for Hadoop have continued to accelerate as enterprise customers are realizing top-line revenue growth and operational efficiencies,” said John Schroeder, cofounder and CEO, MapR Technologies. “Our technology innovations with Apache Hadoop, coupled with a proven, subscription-based licensing model, have enabled our business to grow with predictable success.”

https://www.mapr.com

MapR Raises $110 Million for Apache Hadoop

MapR Technologies, a start-up based in San Jose, California, raised $110 million in venture funding for its distribution for Apache Hadoop. MapR has significant production Hadoop deployments in financial services, healthcare, media, retail, telecommunications, and Web 2.0 companies. The financing will be used to continue growth in the big data and analytics segment, especially to fund additional engineering resources and support...

Monday, July 13, 2015

Intel Previews Enhanced Lustre File System for HPC

Intel previewed a number of new features coming in its Enterprise Edition for Lustre 2.3, including support for Multiple Metadata Targets in the Intel Manager for Lustre GUI. Lustre, which has been in use in the world's largest data centers for over a decade and hardened in the harshest big data environments, uses an object-based storage architecture that can scale to tens of thousands of clients and petabytes of data.

New capabilities will enable Lustre metadata to be distributed across servers. Intel Enterprise Edition for Lustre 2.3 supports remote directories, which allow each metadata target to serve a discrete sub-directory within the file system name space. This enables the size of the Lustre namespace and metadata throughput to scale with demand and provide dedicated metadata servers for projects, departments, or specific workloads.
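On a Lustre file system with distributed metadata enabled, an administrator typically places a remote directory on a specific metadata target with the `lfs mkdir` client tool. As a rough sketch (the mount point, directory names, and MDT index here are illustrative, and exact options vary by Lustre release):

```shell
# Place the 'projects' sub-directory on metadata target MDT0001,
# so its metadata load is served by a dedicated server.
lfs mkdir -i 1 /mnt/lustre/projects

# Confirm which MDT backs the new directory.
lfs getdirstripe /mnt/lustre/projects
```

Each such remote directory anchors its sub-tree's metadata on the chosen target, which is how namespace size and metadata throughput scale with added servers.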

Intel is also preparing to roll out new security, disaster recovery, and enhanced support features in Intel Cloud Edition for Lustre 1.2, which will arrive later this year. These enhancements will include network encryption using IPsec, the ability to recover a complete file system using snapshots, new client mounting tools, updates to instance and target naming, and added network testing tools.

http://www.intel.com/content/www/us/en/software/intel-enterprise-edition-for-lustre-software.html

Tuesday, June 23, 2015

AtScale Raises Funding for Business Intelligence Interface for Hadoop

AtScale, a start-up based in San Mateo, California, announced $7 million in Series A funding for its business intelligence solution for Hadoop.

AtScale aims to be the glue between two fast-growing but currently disconnected markets: Big Data, estimated at $50B by Wikibon, and the Business Analytics industry, a space IDC predicts will reach $59.2B in 2018.

AtScale requires no data movement and no new visualization interface to act as the business interface for Hadoop. It provides a unified semantic layer that natively integrates with tools like Microsoft Excel, Tableau Software and QlikTech. The company says it accelerates query performance by a factor of 100.
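AtScale's engine is proprietary, but the general technique behind that kind of speed-up — answering BI queries from a pre-built aggregate instead of scanning the raw fact rows — can be sketched with Python's built-in sqlite3 module (the table and column names below are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Raw fact table: one row per page view (stand-in for a Hadoop data set).
cur.execute("CREATE TABLE pageviews (day TEXT, site TEXT, views INTEGER)")
cur.executemany(
    "INSERT INTO pageviews VALUES (?, ?, ?)",
    [("2015-06-01", "news", 120), ("2015-06-01", "mail", 80),
     ("2015-06-02", "news", 200), ("2015-06-02", "mail", 50)],
)

# Pre-built aggregate: a semantic layer maintains tables like this so
# BI tools never scan the raw rows at query time.
cur.execute(
    "CREATE TABLE agg_views AS "
    "SELECT day, SUM(views) AS total FROM pageviews GROUP BY day"
)

# A BI query answered from the small aggregate...
fast = dict(cur.execute("SELECT day, total FROM agg_views").fetchall())
# ...returns the same answer as the same query against the raw facts.
slow = dict(cur.execute(
    "SELECT day, SUM(views) FROM pageviews GROUP BY day").fetchall())
assert fast == slow
```

On a real cluster the aggregate is orders of magnitude smaller than the fact data, which is where the claimed query acceleration comes from.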

The company is headed by Dave Mariani (CEO and founder), who was previously responsible for Yahoo!'s advertising and web properties. There, Mariani and his team processed over 30 billion events per day in Hadoop, making that data available to business users. Later, at Klout, Mariani's team managed a large-scale Hadoop cluster that powered a 1-trillion-row data warehouse.

http://www.AtScale.com

Enigma Raises $28 Million for its Analytics Engine

Enigma, a start-up based in New York City, announced a $28.2 million Series B funding round to support its work in data discovery and analytics.

Enigma has developed an analytics engine that can search, discover and connect billions of previously unlinked public records from thousands of governments and organizations across the world. Enigma is now focusing these capabilities on the enterprise market, bringing its unique corpus of public data to bear on the problems and operational challenges facing the Fortune 500.
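Enigma has not disclosed its linking algorithm; a toy illustration of the underlying record-linkage idea — normalizing identifiers so records about the same entity in different public data sets can be joined — might look like this in Python (the data sets, field names, and normalization rules are invented):

```python
import re
from collections import defaultdict

def normalize(name):
    """Canonicalize an organization name so spelling variants match."""
    name = name.lower()
    name = re.sub(r"[.,]", "", name)                    # drop punctuation
    name = re.sub(r"\b(inc|corp|llc|ltd)\b", "", name)  # drop legal suffixes
    return " ".join(name.split())

# Hypothetical records from two previously unlinked public data sets.
contracts = [{"vendor": "Acme Corp.", "amount": 50000}]
inspections = [{"company": "ACME corp", "violations": 2}]

# Link records that normalize to the same entity key.
linked = defaultdict(dict)
for rec in contracts:
    linked[normalize(rec["vendor"])].update(amount=rec["amount"])
for rec in inspections:
    linked[normalize(rec["company"])].update(violations=rec["violations"])
```

At Enigma's scale the hard parts are ambiguity and conflicting sources, but the payoff is the same: records that agree on a canonical key can be connected across data sets.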

The funding round was led by New Enterprise Associates (NEA) with participation from Two Sigma Ventures and New York City Investment Fund, as well as existing investors American Express Ventures, Comcast Ventures and The New York Times Company.

“Enigma was created out of a realization that there is a massive amount of hidden knowledge that is locked away in various data silos and legacy systems -- when people have the tools to make sense of it, data can drive significant and impactful change,” said Marc DaCosta, co-founder and chairman of Enigma. “Abstract and Signals make it easy for organizations of every kind to liberate their own private data and unearth new insights that will allow them to make smart decisions, increase efficiency and compete more effectively.”

http://www.enigma.io