Big Data Platform Engineer

Hyderabad, Telangana, India
Software and Services


Posted: 20 Nov 2018
Weekly Hours: 40
Role Number: 200014535
The people here at Apple don't just build products; they build the kind of wonder that's revolutionized entire industries. It's the diversity of those people and their ideas that inspires the innovation that runs through everything we do, from amazing technology to industry-leading environmental efforts. Join Apple, and help us leave the world better than we found it. Imagine what you could do here. At Apple, great ideas have a way of becoming great products, services, and customer experiences very quickly. Bring passion and dedication to your job and there's no telling what you could accomplish.

Would you like to work in a fast-paced environment where your technical abilities are challenged on a day-to-day basis? If so, Apple's Global Business Intelligence (GBI) team is seeking an experienced Big Data Platform Engineer to build high-quality, scalable, and resilient distributed systems that power Apple's analytics platform and data pipelines. Apple's enterprise data warehouse landscape caters to a wide variety of real-time, near-real-time, and batch analytical solutions. These solutions are an integral part of business functions such as Sales, Operations, Finance, AppleCare, Marketing, and Internet Services, enabling business leaders to make critical decisions. We leverage a diverse technology stack including Teradata, HANA, Vertica, Hadoop, Kafka, Spark, Cassandra, and beyond. Designing, developing, scaling, and managing these big data technologies is a core part of our daily job.

The ideal candidate will be able to think outside the box and should have a passion for working with highly scalable, fault-tolerant, resilient platforms built on open source technologies. The candidate will manage large-scale compute and data clusters and jobs, implement performance optimizations, and build the tools necessary to meet GBI SLAs.

Key Qualifications

  • Strong problem-solving and analytical skills
  • Experience managing Hadoop or Spark clusters
  • Experience running heterogeneous workloads on clusters
  • Strong knowledge of Java or Scala
  • Experience with automation tools like Puppet or Ansible
  • Good understanding of Linux
  • Experience with Kubernetes and/or any other cloud environment
  • Experience with machine learning platforms is a plus
  • Experience with Python or shell scripting is a plus
  • Strong understanding of development processes and agile methodologies
  • Strong communication skills; self-driven, highly motivated, and able to learn quickly


Description

  • Manage and support Hadoop and Spark platforms that run thousands of jobs per day
  • Build tools to manage production environments, including providing capacity projections, performance optimizations, and guidance to application teams
  • Work with open source technologies
  • Build proactive monitoring and alerting systems to detect anomalies and issues in production environments
  • Build tools to identify and improve job and cluster performance, and to reduce the time to resolve production problems
  • Provide Level 2 support for the production environments and for products built by other framework teams
  • Translate complex business requirements into scalable technical solutions
  • Collaborate with multiple cross-functional teams and work on solutions that have a broad impact on Apple's business
  • Work with global teams in the US, Singapore, and Europe, demonstrating effective written and verbal communication with technical and non-technical multi-functional teams

Education & Experience

Bachelor’s Degree or Equivalent with 4+ years of experience in big data

Additional Requirements