SENIOR DATA ENGINEER (WITH SPARK, PYTHON)

Apply now

COMPANY DESCRIPTION

Accesa is a leading technology company headquartered in Cluj-Napoca, with offices in Oradea, Bucharest, and Timisoara, and with 20 years of experience in turning business challenges into opportunities and growth.

A value-driven organisation, it has established itself as a partner of choice for major brands in Retail, Manufacturing, Finance, and Banking. It covers the complete digital evolution journey of its customers, from ideation and requirements setup to software development and managed services solutions.

With more than 1,200 IT professionals, Accesa also has a fast-growing footprint, establishing itself as an employer of choice for IT professionals who are passionate about problem-solving through technology. Coming together in strong tech teams with a customer-centric approach, they enable businesses to grow, delivering value for clients, partners, the industry, and the community.

About the project

Our projects typically range between 8 and 20 weeks, while an account usually addresses several projects with different deliverables. We also love to get involved in any kind of AI-related activity, be it in the discovery, prototyping, or implementation phase.

We also often deliver joint-effort projects, either for internal purposes or to help customers reach their goals, relying on collaboration with other teams: IoT, SAP, Hybris, RPA.

The projects we deliver are mainly focused on the Digital Manufacturing industry, but opportunities sometimes come from other industries such as Finance or Retail.

Your team

The team involved in delivering AI solutions and services often consists of Data Engineers, Data Scientists and Machine Learning Engineers, as part of the Delivery Team in which several other roles are present: Project Manager, Business Analyst, UX Designer, Application/DevOps Architect, Frontend and Backend Developers, QA Engineer.

JOB DESCRIPTION

As part of our Artificial Intelligence Team, you will help shape the future of our software.

You will develop, test, and maintain data architectures that keep data accessible and ready for analysis. Your tasks will include data modelling, ETL (Extract, Transform, Load), construction and development of data architectures, and testing of the database architecture.

Daily responsibilities

  • Create and maintain optimal data pipeline architecture
  • Assemble large, complex data sets that meet functional and non-functional business requirements
  • Identify, design, and implement process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud ‘big data’ technologies
  • Build and use analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
  • Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs
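As a rough illustration of the extract-transform-load work described above, here is a minimal sketch in plain Python. It uses only the standard library so it stands alone; in the role itself the same three stages would run on Spark, Databricks, and Airflow. All data and names here are hypothetical.

```python
import csv
import io

# Hypothetical sample data standing in for a real source system.
RAW_CSV = """order_id,region,amount
1,EMEA,120.50
2,AMER,80.00
3,EMEA,45.25
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse raw CSV text into row dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> dict[str, float]:
    """Transform: aggregate order amounts per region."""
    totals: dict[str, float] = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + float(row["amount"])
    return totals

def load(totals: dict[str, float], store: dict) -> None:
    """Load: write the aggregates into a target store (a plain dict here,
    a warehouse table or data lake in practice)."""
    store.update(totals)

warehouse: dict[str, float] = {}
load(transform(extract(RAW_CSV)), warehouse)
print(warehouse)  # {'EMEA': 165.75, 'AMER': 80.0}
```

The structure, not the tooling, is the point: each stage has a single responsibility, so any one of them can later be swapped for a Spark job or an Airflow task without touching the others.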

Real impact one step at a time

Your impact will extend beyond the project's context through the Competence Area community you will be part of, which places a strong focus on your technical skills.

Professional Opportunities

You will have access to AI Community trainings and programs that build skills on both the technical and tactical side, and you will be engaged in new projects and opportunities landing in our business line.

Community insights

The community consists of Data Scientists and Machine Learning Engineers, along with Data Engineers, sharing knowledge and project insights on a regular basis. We engage in projects spanning Computer Vision, NLP, Advanced Analytics, and prevention and trend analysis.

QUALIFICATIONS

Must have

  • 5+ years of professional experience
  • Experience in working with customer stakeholders
  • Experience working in Agile teams
  • Experience building and optimizing ‘big data’ data pipelines, architectures, and data sets
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
  • Strong analytic skills related to working with unstructured datasets
  • Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management
  • Knowledge of manipulating, processing, and extracting value from large disconnected datasets
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores
  • Technical experience with:
  1. Big data tools: Spark, Databricks
  2. Stream-processing systems: Storm, Spark Streaming, etc.
  3. Object-oriented languages: Python
  4. Visualization tools: PowerBI, Tableau, etc.
  5. Data pipeline and workflow management tools: Airflow
  6. Relational databases: Postgres
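To make the stream-processing item concrete, here is a hedged sketch of the pattern those systems implement: consume events from a source, keep a sliding window, and emit a running aggregate. Plain-Python generators stand in for Storm or Spark Streaming, and all sensor names and readings are invented for illustration.

```python
from collections import deque

def sensor_events():
    """Source: yields (sensor_id, reading) events, as a queue consumer would.
    The readings below are hypothetical sample data."""
    for event in [("s1", 10.0), ("s1", 12.0), ("s1", 14.0), ("s1", 22.0)]:
        yield event

def windowed_mean(events, window_size=3):
    """Operator: emits the mean over a sliding window of recent readings.
    deque(maxlen=...) silently drops the oldest reading once full."""
    window: deque = deque(maxlen=window_size)
    for sensor_id, reading in events:
        window.append(reading)
        yield sensor_id, sum(window) / len(window)

results = list(windowed_mean(sensor_events()))
print(results[-1])  # ('s1', 16.0) — mean of the last three readings 12, 14, 22
```

Because both stages are generators, events flow one at a time rather than being materialized up front, which is the same backpressure-friendly shape the real engines scale out across a cluster.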

ADDITIONAL INFORMATION

At Accesa & RARo you can:

Enjoy our holistic benefits program built on the four pillars we believe come together to support wellbeing: physical, emotional, social, and work-life fusion.

  • Physical: premium medical package for both our colleagues and their children, dental coverage up to a yearly amount, eyeglasses reimbursement every two years, voucher for sport equipment expenses, in-house personal trainer
  • Emotional: individual therapy sessions with a certified psychotherapist, webinars on self-development topics
  • Social: virtual activities, sports challenges, special occasions get-togethers
  • Work-life fusion: yearly increase in days off, flexible working schedule, birthday, holiday and loyalty gifts for major milestones, work from home bonuses

Apply now

Type: Remote
Company: Accesa