zData Inc. Big Data Consulting and Services - Overview and Summary
Transcript of zData Inc. Big Data Consulting and Services - Overview and Summary
zData Inc. Overview
Enterprise Big Data Solutions
– Preferred Channel Partner for Pivotal, EMC, and Cisco
– Single source for Software, Hardware, and Services Procurement across multiple vendors
Experience & Capacity
– Proven Leadership Team
– Over 100 Years of collective Pivotal architecture knowledge
– Retail, Utilities, Banking, Government (DoD)
– Global Reach: US, APJ, EMEA
Training
– Online Training Portal
– Onsite Training: 1-, 3-, and 5-day tracks
Services
– BI and Advanced Analytics Platform and Pilot Programs
– Commercial Big Data Turnkey Solutions and Migration Path
– Enterprise Data Lake Adoption
Solutions
– Fully Hosted and Managed Environments (Cloud and On-Premise)
– Migration Expertise
– Data Lake Design and Methodology
– Services Enablement Tools
About zData
Consulting
– Hadoop: Pivotal HD, Hortonworks, MapR
– Greenplum
– SQL on Hadoop, HAWQ
– Spark and Shark
BI & Advanced Analytics Platform
Data Lake Solutions
Custom Toolsets
Solutions
BI & Analytics Platform
Store: Gather, integrate, load, and manage your data in the cloud or on premise
Collaborate: Validate and dimensionalize data and share it for contextual services
Prediction and Advanced Analytics: Execute machine learning and predictive analysis through an extensive library of functions and features
Analyze and Visualize: Collaborate and uncover insights through a comprehensive, diverse set of visualizations
The ZD Platform offers all the technical support and resources your company needs to get started with Big Data.
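To make the "Prediction and Advanced Analytics" pillar concrete, here is a minimal sketch of the kind of predictive function such an analytics library exposes: an ordinary least-squares linear fit. This is an illustration only, not zData's actual implementation; the function names are made up.

```python
# Illustrative only: a minimal least-squares linear regression of the kind
# an in-database analytics function library provides as a built-in.
def linear_fit(xs, ys):
    """Return (slope, intercept) minimizing squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

def predict(model, x):
    """Apply the fitted model to a new observation."""
    slope, intercept = model
    return slope * x + intercept

model = linear_fit([1, 2, 3, 4], [2, 4, 6, 8])
print(predict(model, 5))  # → 10.0
```

In a real MPP deployment the same computation would run in parallel across segments over tables rather than Python lists.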
ZD Pilot Programs
Pilot Overview
- 8 week program
- Collaboration Tool: ZD Chorus
- Up to 1 TB of Data
- Fault Tolerant: Mirroring
- Security: SSL and VPN
- Private Clusters
- Platform, Software, and Services all Included
- Free online HAWQ & Greenplum Training course with Pilot Program subscription
Storage + BI and Visualization + Advanced Analytics
zData’s Big Data Pilot Program is a cloud-based, end-to-end BI and analytics solution that leverages the latest MPP performance and SQL on Hadoop technologies. This eight-week pilot provides a testing platform that delivers all the components you will need for your next-generation Big Data solution. zData takes the guesswork out by providing a fully integrated stack and including key services to help your team get started quickly.
STORAGE
BI & VISUALIZATION
ADVANCED ANALYTICS
Data Lake
– 2 Days Tools Training
– 2 Days Architecture
– 1 Day Analytics Demo
– 2 Days Collaboration
– 2 Days Exploration

– 3 Days Tools Training
– 2 Days Architecture
– 3 Week Mini-Project
– 2 Week Discovery Review

– 5 Days Training
– Analytics Roadmap
– Test Model Deployment
– Multiple Iterations
Data Lake Solutions Enterprise Adoption
ZD Chorus: Agile Analytics for a Collaborative Environment
MetaELT: Metadata-Driven GPDB ETL
zMonitor: DCA and GPDB Cluster Health Alerting and Monitoring via Nagios
zData Migration Toolkit: Migrate databases from Greenplum or HAWQ
zData Backup, Replication, and DR
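The idea behind a metadata-driven ETL tool like MetaELT can be sketched as follows: table and column mappings live in metadata, and the loader generates the Greenplum SQL from them instead of hand-coding each job. The mapping below is invented for illustration and is not zData's actual schema or API.

```python
# Hypothetical sketch of metadata-driven ETL: the load SQL is rendered
# from a metadata mapping rather than written by hand for each table.
MAPPING = {                      # illustrative metadata, not zData's schema
    "target": "dw.orders",
    "source": "staging.orders_raw",
    "columns": {                 # target column -> source expression
        "order_id": "id::bigint",
        "order_ts": "created_at::timestamp",
        "amount":   "COALESCE(amount, 0)",
    },
}

def build_load_sql(mapping):
    """Render an INSERT ... SELECT statement from the metadata mapping."""
    targets = ", ".join(mapping["columns"])
    sources = ", ".join(mapping["columns"].values())
    return (f"INSERT INTO {mapping['target']} ({targets}) "
            f"SELECT {sources} FROM {mapping['source']};")

print(build_load_sql(MAPPING))
```

Adding a table to the warehouse then means adding a mapping entry, not writing a new loader.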
zData Custom Toolsets
Big Data Consulting
Managed Services
Services
General Services
Greenplum
Hadoop
SQL on Hadoop (HAWQ)
Platform as a Service

Greenplum and HAWQ
Hadoop
Big Data Consulting General Services
Pivotal Exclusive Services
Infrastructure
– Pivotal GPDB Installs
– DCA Install and Upgrade

Implementation
– Data Migrations
– Analytics Labs
– Audits/Health Checks

Infrastructure
– Software-only hardware support
– Cisco, HP, and Dell Cluster Install
– Amazon (AWS) GPDB, Hadoop & BI hosted cluster support and services

Implementation
– Managed Services & Cluster Co-Lo
– VCE Configuration & Certification
– Benchmarking
– China and India Offshore Pivotal DB & Hadoop Services

Training
– Onsite: Fundamentals and Advanced
– Online Training Portal
Platforms
- Greenplum
- Hadoop
- SQL on Hadoop (HAWQ)
- Platform as a Service
Big Data Consulting
Greenplum
zData Inc. Greenplum Database Consulting offers a unique focus on Pivotal-based application and technology initiatives by combining leading expertise, broad coverage, global scale, and flexible delivery.

Hadoop
zData is versed in all applications of Hadoop, a rapidly evolving open source framework that scales to process huge datasets across distributed systems.

SQL on Hadoop (HAWQ)
From Pivotal HAWQ to Cloudera Impala, zData has the expertise to configure, implement, and train in any environment.

Spark Consulting
zData can guide you on your journey with Apache Spark, a large-scale processing engine that can replace Hadoop MapReduce, offering improved performance and real-time stream processing.

Platform as a Service
zData recognizes the need for cross-platform software, application, and framework offerings, and can advise you on all of your Cloud Foundry and PaaS needs.
zData’s focus is on reducing MTD and MTM

Mean Time to Deployment (MTD)
- Use and Development of PaaS toolsets
- Automated Build Scripts
- Standardized Offerings
- Leverage the Cloud
- Best Practice Guides + over 1000

Mean Time to Migrate (MTM)
- Standardized Methodology
- ETL and DDL Conversion Scripts
- SQL Conversion Automation Scripts
  – Oracle, Netezza, TD, DB2 to GPDB and HAWQ
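The SQL conversion automation scripts above can be sketched as a small rule table of dialect rewrites. This is a hedged illustration of the approach, not zData's tooling: a real migration needs a proper SQL parser, and the three rules below only cover trivial Oracle-to-Greenplum cases.

```python
import re

# Illustrative sketch of SQL conversion automation: a few common
# Oracle-to-Greenplum rewrites applied as regex rules. Real conversions
# require parser-based tooling; this only shows the shape of the idea.
RULES = [
    (re.compile(r"\bNVL\s*\(", re.I), "COALESCE("),        # NVL -> COALESCE
    (re.compile(r"\bSYSDATE\b", re.I), "CURRENT_TIMESTAMP"),
    (re.compile(r"\bVARCHAR2\b", re.I), "VARCHAR"),
]

def convert(sql):
    """Apply each dialect rewrite rule in order."""
    for pattern, replacement in RULES:
        sql = pattern.sub(replacement, sql)
    return sql

print(convert("SELECT NVL(total, 0), SYSDATE FROM sales"))
# → SELECT COALESCE(total, 0), CURRENT_TIMESTAMP FROM sales
```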
Reduction in Human Capital and Operational Costs
- Support Automation and Monitoring
- Auto Detection and Break/Fix
- Centralized Center of Excellence
- Remote Engineering and Managed Environments
Services Enablement efficiency & automation
Key Responsibilities
Infrastructure
– Security
– Workload Management
– Schema Creation
– Role/User/Group Creation

Low-Cost Continuous Support
Quarterly Health Checks
Active Investigation & Resolution
Maintenance
– Explain Plan Review
– Catalog Bloat
– Vacuum
– Partition Pruning

Development
– SQL
– Data Model Design
– DDL Creation
– Schema Design
– Stored Procedures

Tuning
– Column vs. Row
– SQL
– PL/pgSQL & SQL Functions
– Distribution Strategies
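The tuning items above (column vs. row storage, distribution strategies) translate directly into Greenplum DDL. The sketch below renders those two choices; the table and column names are made up, while `DISTRIBUTED BY` / `DISTRIBUTED RANDOMLY` and the `appendonly`/`orientation` storage options are standard Greenplum syntax.

```python
def create_table_ddl(table, columns, dist_key=None, columnar=False):
    """Render Greenplum CREATE TABLE DDL reflecting two tuning choices:
    storage orientation (row vs. column) and distribution strategy."""
    cols = ", ".join(f"{name} {ctype}" for name, ctype in columns)
    # Column orientation (append-only) suits scans of few columns over many rows.
    storage = " WITH (appendonly=true, orientation=column)" if columnar else ""
    # Distribute on a high-cardinality join key when possible; otherwise
    # DISTRIBUTED RANDOMLY avoids data skew across segments.
    dist = f" DISTRIBUTED BY ({dist_key})" if dist_key else " DISTRIBUTED RANDOMLY"
    return f"CREATE TABLE {table} ({cols}){storage}{dist};"

print(create_table_ddl("sales_fact",
                       [("sale_id", "bigint"), ("amount", "numeric")],
                       dist_key="sale_id", columnar=True))
```

Choosing the distribution key well matters because co-located joins avoid redistributing rows between segments at query time.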
Resident & Remote DBA
*zData also offers full life cycle migration assessments and services
Quickstart – 6 Week Migrations
– Sample Data Set
– Tool Connectivity
– Workload Management/Tuning
– Schema Creation
– Explain Plan Review
– Catalog Bloat
– Vacuum
– Partition Pruning
Quickstart – 8 Week Migrations & ETL
– Sample Data Set
– Pig and Hive Setup
– Users and Groups
– MapReduce
– Hive SQL/UDF
– Cascading
– Custom ETL and BI Tools
– HBase Migration Setup
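The MapReduce item above can be illustrated with the canonical word-count job. This pure-Python sketch mimics the map, shuffle, and reduce phases locally; on a Hadoop cluster the same logic runs distributed across nodes.

```python
from collections import defaultdict

# Pure-Python sketch of the classic MapReduce word count: map emits
# (word, 1) pairs, the shuffle groups pairs by key, reduce sums counts.
def map_phase(lines):
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {word: sum(counts) for word, counts in groups.items()}

counts = reduce_phase(shuffle(map_phase(["big data", "Big Data pipelines"])))
print(counts)  # → {'big': 2, 'data': 2, 'pipelines': 1}
```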
Quickstart – 8 Week Setup
– Sqoop Data Load
– GP Schema Setup
– Table Migration
– User/Group/Roles
– Demo
– HDFS to HAWQ Data Movement
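The Sqoop load step above amounts to assembling an import command that lands relational data in HDFS, where HAWQ can then read it. The sketch below builds such a command; `--connect`, `--table`, `--target-dir`, and `--num-mappers` are standard Sqoop import options, while the JDBC URL and table names are placeholders.

```python
import shlex

# Sketch of assembling a Sqoop import into HDFS, from which HAWQ can
# read the data. Connection details below are illustrative placeholders.
def sqoop_import_cmd(jdbc_url, table, target_dir, mappers=4):
    args = [
        "sqoop", "import",
        "--connect", jdbc_url,        # source database JDBC URL
        "--table", table,             # source table to export
        "--target-dir", target_dir,   # HDFS destination directory
        "--num-mappers", str(mappers),
    ]
    return " ".join(shlex.quote(a) for a in args)

cmd = sqoop_import_cmd("jdbc:oracle:thin:@db:1521/orcl", "SALES", "/data/sales")
print(cmd)
```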
Services Standard Offerings
Kickstart – 4 Week Big Data Infrastructure
– Rack and Stack
– Custom UCS Platform Config
– Pre-built Pivotal Cisco appliance
– UCS Manager Setup
– UCS GPDB Monitoring
– Environment Design
– Custom Configurations
– Reference Configurations
Kickstart – 1 Week GP to GP Migrations
– Greenplum Software-Only Installations
– DCA Rack/Stack Install
– Amazon Web Services Setup/Install
– Greenplum-to-Greenplum environment migrations
– Backup infrastructure setup
– Timing and price dependent on volume
Kickstart – 1 Week Migrations and ETL
– O/S Install and Setup
– Hadoop Install/Setup/Config
– N=3 Environment Setup
– Hadoop environment replication setup
Services Software + Infrastructure
– Patch Management
– Capacity
– Memory Usage
– Backups
– Security

– Start/Stop
– Break/Fix
– Segment Recovery
– Backup Restore

– Dedicated Support
– Remote Support
– Onsite Escalation
– Problem Isolation and Resolution

– Health Checks
– Capacity Planning
– System Integration
Managed Services
Managed Services Greenplum and HAWQ
System Administration
- Scheduled System Jobs
- Disaster Recovery Planning
- Patching and Upgrades
- LDAP
- More VRP Integration

Database Maintenance
- Batch Load Monitoring
- Capacity Planning and Monitoring
- Memory Management and Settings
- 24x7x365 Monitoring

Backup / Recovery
- Backup Scripts
- Monitor Backups
- Segment Recovery

Tool Connectivity
- DIA Module 3rd Party Software Installation and Configuration
- ODBC/JDBC Setup
Greenplum Managed Services
Help Desk
Quarterly Health Checks
Issue Tracking, Resolution
Backup, ETL Management
Vendor Management
Key Support Responsibilities
Managed Services Hadoop
System Administration
- Disaster Recovery Planning
- Patching and Upgrades
- Quarterly Health Checks
- LDAP

Cluster Maintenance
- HDFS Tuning
- Capacity Planning
- Power Management
- Memory Management and Settings
- 24x7x365 Monitoring

Monitoring
- Cluster node monitoring
- NameNode monitoring
- MapReduce job completion

Tool Connectivity
- Hive/HBase
- ODBC/JDBC Setup
Hadoop Managed Services
Key Responsibilities
Help Desk
Quarterly Health Checks
Issue Tracking, Resolution
Data Integration
Vendor Management
Training
Pivotal Fundamentals 4.2
Hadoop Administrator
GPDB Advanced Training
Online Training Portal
Onsite Training

Online Training Portal: zData University

Unique Online Training Portal
- Greenplum Developer Training
- Greenplum Database Administrator Training
- Greenplum & HAWQ Fundamentals Overview
- HAWQ Fundamentals
- Gemfire XD Fundamentals
- Data Lake 101
For Developers, Software Engineers, and Power Users
zData’s new Online Training Platform now offers top industry courses at the click of a button. Receive best-in-class training for Greenplum, HAWQ, GemfireXD, Hadoop, Data Lake concepts, and other top technologies within the Big Data ecosystem. With zData University, you are learning from the real-world experience of our senior field engineers.
Fundamentals Advanced
– 5 Days
– Installation and Configuration
– PostgreSQL
– MPP Architecture
– Explain Plan Review
– Catalog Bloat
– Vacuum
– Partition Pruning
Administration Developer
– 3 Days
– Installation and Configuration
– Pig and Hive Setup
– Users and Groups
– MapReduce
– Hive SQL/UDF
– Cascading
– Custom ETL and BI Tools
– HBase Migration Setup
Fundamentals
– Sqoop Data Load
– GP Schema Setup
– External HDFS Calls
Onsite Training
Contact Us | [email protected]