<p>Description</p> <p>Amazon is on a mission to redefine the future of automation - and we're looking for exceptional talent to help lead the way. We are building the next generation of advanced robotic systems that seamlessly blend cutting-edge AI, sophisticated control systems, and novel mechanical design to create adaptable, intelligent automation solutions capable of operating safely alongside humans in dynamic, real-world environments.</p> <p>At Amazon, we leverage the power of machine learning, artificial intelligence, and advanced robotics to solve some of the most complex operational challenges at a scale unlike anywhere else in the world. Our fleet of robots spans hundreds of facilities globally, working in sophisticated coordination to deliver on our promise of customer excellence - and we're just getting started.</p> <p>As a Sr. Applied Scientist in Robot Perception, you will be at the forefront of this transformation. You will develop and deploy state-of-the-art perception algorithms that enable robots to truly understand and interact with the physical world - bridging the gap between theoretical research and real-world impact. Bringing deep expertise in Computer Vision and a nuanced understanding of the capabilities and limitations of modern Vision-Language Models (VLMs), you will innovate boldly and push the boundaries of what's possible. Our vision for the Perception layer is ambitious: to enable seamless, intelligent interaction between the user, the robot, and its environment.</p> <p>This is a rare opportunity to work at the intersection of deep learning, large language models, and robotics - contributing to research that doesn't just advance the field, but reshapes it. 
You will collaborate with world-class teams pioneering breakthroughs in dexterous manipulation, locomotion, and human-robot interaction, all at an unprecedented scale.</p> <p>Join us in building intelligent robotic systems that will define the future of automation and human-robot collaboration.</p> <p>Key job responsibilities</p> <ul> <li>Design, develop, and deploy perception algorithms for robotics systems, including object detection, segmentation, tracking, depth estimation, and scene understanding </li><li>Lead research initiatives in computer vision, sensor fusion, and 3D perception </li><li>Collaborate with cross-functional teams including robotics engineers, software engineers, and product managers to define and deliver perception capabilities </li><li>Drive end-to-end ownership of ML models - from data collection and labeling strategy to training, evaluation, and deployment </li><li>Mentor junior scientists and engineers; contribute to a culture of technical excellence </li><li>Define and track key metrics to measure perception system performance in real-world environments </li><li>Publish research findings in top-tier venues (CVPR, ICCV, ECCV, ICRA, NeurIPS, etc.) and contribute to patents </li></ul> <p>A day in the life</p> <ul> <li>Train ML models for deployment in simulation and on real-world robots, and identify and document their limitations post-deployment </li><li>Drive technical discussions within your team and with key stakeholders to develop innovative solutions to address identified limitations </li><li>Actively contribute to brainstorming sessions on adjacent topics, bringing fresh perspectives that help peers grow and succeed - and in doing so, build lasting trust across the team </li><li>Mentor team members while maintaining significant hands-on contribution to technical solutions </li></ul> <p>About the team</p> <p>Our Industrial Robotics Group is a diverse team of scientists and engineers passionate about building intelligent machines. 
We value curiosity, rigor, and a bias for action. We believe in learning from failure and iterating quickly toward solutions that matter.</p> <p>Basic Qualifications</p> <ul> <li>PhD in engineering, technology, computer science, machine learning, robotics, operations research, statistics, mathematics, or an equivalent quantitative field </li><li>3+ years of experience building machine learning models for business applications </li><li>Experience programming in Java, C++, Python, or a related language </li><li>Experience with deep learning and machine learning methods </li><li>Publications at top-tier peer-reviewed conferences or journals </li></ul> <p>Preferred Qualifications</p> <ul> <li>Experience with large-scale distributed systems such as Hadoop, Spark, etc. </li><li>PhD in Robotics, with a focus on Robot Perception </li><li>Experience leading research initiatives in full-stack robotics or foundation models </li><li>Track record of successful production robotics deployments </li><li>History of technical leadership and team mentorship </li><li>Experience bridging research with practical engineering implementation in robotics systems </li><li>Extensive programming skills in Python and PyTorch/JAX </li></ul> <p>Amazon is an equal opportunity employer and does not discriminate on the basis of protected veteran status, disability, or other legally protected status.</p> <p>Los Angeles County applicants: Job duties for this position include: work safely and cooperatively with other employees, supervisors, and staff; adhere to standards of excellence despite stressful conditions; communicate effectively and respectfully with employees, supervisors, and staff to ensure exceptional customer service; and follow all federal, state, and local laws and Company policies. Criminal history may have a direct, adverse, and negative relationship with some of the material job duties of this position. 
These include the duties and responsibilities listed above, as well as the abilities to adhere to company policies, exercise sound judgment, effectively manage stress and work safely and respectfully with others, exhibit trustworthiness and professionalism, and safeguard business operations and the Company's reputation. Pursuant to the Los Angeles County Fair Chance Ordinance, we will consider for employment qualified applicants with arrest and conviction records.</p> <p>Pursuant to the San Francisco Fair Chance Ordinance, we will consider for employment qualified applicants with arrest and conviction records.</p> <p>Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.</p> <p>The base salary range for this position is listed below. Your Amazon package will include sign-on payments and restricted stock units (RSUs). Final compensation will be determined based on factors including experience, qualifications, and location. Amazon also offers comprehensive benefits including health insurance (medical, dental, vision, prescription, Basic Life & AD&D insurance and option for Supplemental life plans, EAP, Mental Health Support, Medical Advice Line, Flexible Spending Accounts, Adoption and Surrogacy Reimbursement coverage), 401(k) matching, paid time off, and parental leave. Learn more about our benefits at https://amazon.jobs/en/benefits.</p> <p>USA, CA, SAN FRANCISCO - 192,200.00 - 260,000.00 USD annually</p>