Data Scientist Specialist

Worldwide - Remote


Invisible Technologies

We've trained 80% of the world's top AI models. Now, we'll make them work for you.



What You’ll Do

 

You’ll design and deliver machine learning solutions that blend cutting-edge research with practical deployment. Working across industries and use cases, you’ll tackle challenging problems in computer vision and build robust models that power client-facing tools and internal platforms.

  • Develop End-to-End CV Models: Design, train, and evaluate deep learning models (e.g., YOLO, optical flow, ResNet architectures) tailored to complex visual data pipelines and novel problem domains (see the sketch after this list).
  • Collaborate on Real-World Solutions: Partner with software engineers and fellow data scientists to integrate research-driven models into deployable systems that operate in dynamic production environments.
  • Client-Focused Problem Solving: Work closely with external stakeholders to frame ambiguous problems, explore solution paths, and translate technical insights into impactful results.
  • Explore, Iterate, Validate: Lead the exploration and analysis of large datasets using tools like Pandas, NumPy, and Spark to inform model design and evaluate performance under realistic constraints.
  • Research-Driven Innovation: Identify when state-of-the-art ideas can be adapted and productionized in service of the problem at hand.
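
To give a concrete feel for this kind of work, here is a minimal, purely illustrative sketch of fine-tuning a pretrained ResNet with PyTorch. The dataset path, class count, and hyperparameters are hypothetical placeholders, not part of any Invisible Technologies codebase.

```python
# Purely illustrative sketch: fine-tuning a pretrained ResNet for a small
# image-classification task. Paths, class count, and hyperparameters are
# hypothetical placeholders.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

NUM_CLASSES = 4  # hypothetical number of target classes

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical local dataset laid out for ImageFolder (one folder per class).
train_ds = datasets.ImageFolder("data/train", transform=transform)
train_dl = torch.utils.data.DataLoader(train_ds, batch_size=32, shuffle=True)

# Load ImageNet weights and swap the classification head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for images, labels in train_dl:  # one illustrative training pass
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```

The same pattern extends to the detection and optical-flow models named above, plus the evaluation and deployment work described in the other bullets.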



What We Need

 

Professional Experience:

  • 2+ years of hands-on experience building computer vision models using modern deep learning techniques.
  • Proven ability to take models from prototype to production in Python-based workflows.
  • Experience working in client-facing roles or directly engaging with stakeholders to refine project requirements.

Technical Expertise:

  • Strong proficiency in Python and commonly used ML/data libraries (Pandas, NumPy, PyTorch, etc.).
  • Familiarity with core computer vision tools and model types (YOLO, ResNet, segmentation models, optical flow, etc.).
  • Experience working with large-scale datasets using distributed tools such as Spark (see the sketch after this list).
  • Comfort navigating cloud environments (GCP, AWS, or similar); Databricks experience is a plus.
  • Solid grasp of experimental design, model evaluation, and data debugging practices.
  • Strong code hygiene: able to write clean, modular, testable code in collaborative environments.
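
As an illustrative example of the Spark-based exploration mentioned above, the sketch below summarizes a hypothetical annotations dataset; the file path and column names are assumptions for illustration only.

```python
# Purely illustrative sketch: exploring a (hypothetical) annotation dataset
# with Spark to inform model design. Path and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("annotation-exploration").getOrCreate()

# Hypothetical Parquet dataset of bounding-box annotations.
df = spark.read.parquet("data/annotations/")

# Class balance and average box size, ordered by frequency.
(df.groupBy("label")
   .agg(F.count("*").alias("n_boxes"),
        F.avg("bbox_area").alias("mean_bbox_area"))
   .orderBy(F.desc("n_boxes"))
   .show(20, truncate=False))

spark.stop()
```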

Bonus (Nice to Haves):

  • Familiarity with MLOps practices (model tracking, versioning, reproducibility); see the sketch after this list.
  • Experience contributing to multi-model systems or pipelines built by larger teams of data scientists and engineers.
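
As a sketch of the MLOps point above (one common approach, not a prescribed stack), basic experiment tracking with MLflow might look like this; the run name, parameter values, metric, and checkpoint path are hypothetical.

```python
# Purely illustrative sketch: experiment tracking with MLflow for
# reproducibility. All values and file names are hypothetical placeholders.
import mlflow

with mlflow.start_run(run_name="resnet18-baseline"):
    mlflow.log_params({"backbone": "resnet18", "lr": 1e-4, "batch_size": 32})
    # ... training loop would run here ...
    mlflow.log_metric("val_accuracy", 0.91)  # placeholder metric value
    mlflow.log_artifact("model.pt")          # hypothetical saved checkpoint
```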

 

We offer a pay range of $35 to $50 per hour, with the exact rate determined after evaluating your experience, expertise, and geographic location. Final offer amounts may vary from the pay range listed above. As a contractor, you’ll supply a secure computer and high-speed internet; company-sponsored benefits such as health insurance and PTO do not apply.

Important:

All candidates must pass an interview as part of the contracting process.
