GPU Machine Learning Engineer
London, Greater London, United Kingdom
Software and Services
Apple’s GPU Compute and Advanced Rendering team provides a suite of high-performance GPU algorithms for developers inside and outside of Apple across iOS, macOS and Apple TV. Our efforts are currently focused on the key areas of linear algebra, image processing, machine learning, and ray tracing, along with other projects of key interest to Apple. We are always looking for exceptionally talented individuals to grow our top-notch team.
- Excellent programming and problem-solving skills
- Good understanding of machine learning fundamentals
- Experience with system-level programming and computer architecture
- Background in mathematics, including linear algebra and numerical methods, is a plus
- Experience with high-performance parallel programming, GPU programming experience a plus
- Experience with machine learning libraries (TensorFlow, PyTorch); experience adding runtime and computation graph support is a plus
Our team is seeking extraordinary machine learning and GPU programming engineers who are passionate about providing robust compute solutions for accelerating machine learning networks on Apple’s GPUs. Work includes defining and implementing APIs in Metal Performance Shaders, investigating new algorithms, adding GPU support for high-level machine learning libraries, optimizing the machine learning computation graph, and performing in-depth analysis and kernel-level optimizations to ensure the best possible performance across GPU families. The role offers the opportunity to influence the design of compute and programming models in next-generation GPU architectures.
Education & Experience
Technical BS/MS degree; PhD is a plus.
- Strong communication and collaboration skills
- Strong track record of building high-performance, production-quality software on schedule