Senior Data Scientist

  • Full-time

We are looking for a curious and creative Data Scientist to join our team on an exciting new project that will shape the future of energy with AI.

This is a flexible hybrid role, ideally based in Iasi or Bucharest - perfect if you like the mix of deep focus and quality team interactions.

If you are obsessed with AI, curious to learn more about the energy world and ready to build something that truly matters, we would love to meet you!

The Gig

OGRE is an AI-based tech company aiming to empower the energy sector with an innovative energy forecasting and optimization platform. We already deliver our service to multiple Romanian and international clients who want to optimize their energy flows, reduce costs for end consumers, minimize waste and cut carbon emissions.

We are a small team of smart, ambitious engineers and data scientists who share a passion for building great solutions. With solid academic backgrounds – including research experience from Oxford, Cambridge and Berkeley – and complementary expertise, we are keen to disrupt the energy sector with a scalable and user-friendly energy forecasting and optimization platform.

If you asked our colleagues what they value most about working at OGRE, you’d probably hear:

  • the opportunity to work on technology with real-world impact for the future.

  • a clear and efficient task allocation process.

  • a healthy culture of collaboration, learning, and solution-oriented thinking.

What we’re looking for:

We are growing and looking for an experienced, self-driven Data Scientist to join us in building the AI Energy Hub platform. If you are excited by AI, thrive on hands-on work and want to make a tangible difference in the energy sector – let’s talk. Below you will find more details about the role and what OGRE offers.

Minimum requirements:

  • Bachelor’s degree in computer science, information systems or a related field.

  • 4 years of relevant work experience.

What we offer:

  • Hybrid work with 1-2 days in the office per month. We have an ultra-modern office in Bucharest, in one of the Globalworth buildings.

  • Build a platform that helps global clients optimize their energy flows and reduce carbon emissions.

  • Work alongside some of the sharpest minds in AI and energy.

Benefits

  • Attractive compensation package

  • Flexible working hours

  • Permanent employment contract

  • High-tech hardware

  • Friendly and supportive team

What will you be doing?

  • Model energy flows across generation, storage, EV charging and consumption using time-series and optimization algorithms.

  • Design smart features from weather, tariffs, telemetry and grid signals to derive actionable insights.

  • Build reliable pipelines for ingesting and transforming high-frequency data from SCADA, IoT and EMS systems.

  • Deliver intelligent outputs such as forecasts, anomaly detection and control recommendations that enable real-time decisions.

  • Automate & monitor model training, deployment and performance across multiple client sites.

  • Collaborate across teams to turn complex energy problems into simple, scalable AI-driven solutions.
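To give a flavour of the day-to-day modelling work described above, here is a minimal, purely illustrative sketch of short-horizon load forecasting with lag features in Python (using Pandas, NumPy and Scikit-learn, the libraries listed below). The synthetic data and all names are hypothetical, not part of OGRE's platform.

```python
# Illustrative sketch only: forecast hourly energy consumption one step
# ahead from its previous 24 hourly values. Synthetic data, not real telemetry.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

# Synthetic hourly load with a daily cycle plus noise.
rng = np.random.default_rng(0)
idx = pd.date_range("2024-01-01", periods=24 * 60, freq="h")
load = 100 + 20 * np.sin(2 * np.pi * idx.hour / 24) + rng.normal(0, 2, len(idx))
df = pd.DataFrame({"load": load}, index=idx)

# Lagged values as features: predict hour t from hours t-1 .. t-24.
for lag in range(1, 25):
    df[f"lag_{lag}"] = df["load"].shift(lag)
df = df.dropna()

X, y = df.drop(columns="load"), df["load"]
split = len(df) - 24 * 7  # hold out the last week for evaluation
model = GradientBoostingRegressor().fit(X[:split], y[:split])
mae = np.mean(np.abs(model.predict(X[split:]) - y[split:]))
print(f"hold-out MAE: {mae:.2f}")
```

Real deployments would of course add exogenous drivers (weather, tariffs, grid signals) and proper backtesting, as the responsibilities above suggest.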

What the ideal candidate should bring:

  • Strong knowledge of statistics, as well as of research and development of machine learning models.

  • Excellent programming skills in Python, including experience with relevant data science and machine learning libraries (e.g., Pandas, NumPy, Scikit-learn, TensorFlow).

  • Ability to efficiently work with large datasets and use SQL or other database technologies for data querying.

  • Previous experience with energy optimization, or with optimization or scenario-simulation projects more broadly.

  • Teamwork skills and the ability to collaborate with engineers, product teams, and other colleagues to achieve organizational goals.
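As a small, hypothetical illustration of the SQL-from-Python workflow mentioned above, the snippet below aggregates time-series readings with an in-memory SQLite database; the table and column names are invented for the example.

```python
# Illustrative only: querying time-series readings with SQL from Python.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE readings (site TEXT, hour INTEGER, kwh REAL)")
con.executemany(
    "INSERT INTO readings VALUES (?, ?, ?)",
    [("plant_a", h, 50.0 + h) for h in range(24)]
    + [("plant_b", h, 30.0) for h in range(24)],
)

# Per-site daily totals -- the kind of aggregate that feeds a model.
rows = con.execute(
    "SELECT site, SUM(kwh) FROM readings GROUP BY site ORDER BY site"
).fetchall()
print(rows)  # [('plant_a', 1476.0), ('plant_b', 720.0)]
```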

These skills are very nice to have:

  • Understanding and practical use of meteorological data: GRIB/GRIB2 & NetCDF ingestion/degribbing; model run cycles and lead times (e.g., GFS/ECMWF/ICON/HRRR, ensembles); grid/lat-lon indexing and interpolation (nearest/bilinear); downscaling and bias correction; reanalysis vs. forecast datasets; spatial joins to assets/gridpoints; and correct temporal aggregation/units.

  • Lightweight model registry/feature store practices and artifact/version management.

  • Cost awareness for training/inference and basic capacity planning.
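One of the meteorological skills listed above, bilinear interpolation of a regular lat/lon grid to an asset site, can be sketched in a few lines of NumPy. This is a generic textbook implementation under the assumption of an ascending regular grid (as you would get after degribbing a GRIB2 field); it is not taken from any particular library.

```python
# Hypothetical sketch: bilinear interpolation of a regular, ascending
# lat/lon weather grid to a single asset location.
import numpy as np

def bilinear(lats, lons, field, lat, lon):
    """Interpolate `field` (shape lat x lon) at the point (lat, lon)."""
    i = np.searchsorted(lats, lat) - 1  # lower-left cell indices
    j = np.searchsorted(lons, lon) - 1
    ty = (lat - lats[i]) / (lats[i + 1] - lats[i])  # fractional position
    tx = (lon - lons[j]) / (lons[j + 1] - lons[j])  # inside the cell
    return ((1 - ty) * (1 - tx) * field[i, j]
            + (1 - ty) * tx * field[i, j + 1]
            + ty * (1 - tx) * field[i + 1, j]
            + ty * tx * field[i + 1, j + 1])

# A 0.25-degree grid (GFS-like); the field's value equals the longitude,
# so a linear field lets us check the result exactly.
lats = np.arange(44.0, 48.0, 0.25)
lons = np.arange(26.0, 30.0, 0.25)
field = np.tile(lons, (len(lats), 1))
t = bilinear(lats, lons, field, 47.16, 27.59)  # roughly Iasi; ~27.59
```

Production pipelines would typically rely on dedicated tooling (e.g., xarray with a GRIB backend) rather than hand-rolled indexing, but the geometry is the same.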

Get in touch

Please fill in the form below to apply for this position.