SoftwareONE

Data Architect


Pay: Competitive
Location: Paris / Île-de-France
Employment type: Full-Time

This job is now closed

  • Job Description

      Req#: 23645
      Why SoftwareOne?



      SoftwareOne is a leading global software and cloud solutions provider that is redefining how companies build, buy and manage everything in the cloud. By helping clients to migrate and modernize their workloads and applications – and in parallel, to navigate and optimize the resulting software and cloud changes – SoftwareOne unlocks the value of technology. The company’s 8,900 employees are driven to deliver a portfolio of 7,500 software brands with sales and delivery capabilities in 90 countries. Headquartered in Switzerland, SoftwareOne is listed on the SIX Swiss Exchange under the ticker symbol SWON. Visit us at https://www.softwareone.com/en


      The role

      Join our team as a Data Architect: your role will be crucial to our Data and AI projects!

      What will you do?


      You will be responsible for designing and establishing the appropriate data architecture for our cloud projects (typically on GCP). This involves understanding business and technical requirements, identifying data storage, processing and analytics needs, and designing scalable and efficient solutions using GCP services.

      You will work closely with the team of Data Engineers and Data Scientists to understand customer requirements and challenges, and provide technical guidance on the implementation of data solutions, ensuring industry best practices and standards are followed.

      Evaluate and select the most appropriate GCP services and tools for each project. Understand the strengths and limitations of cloud services, and stay up to date with the latest trends and features in cloud engineering and data science.

      Design and develop efficient and reliable data pipelines using the data processing and storage capabilities of GCP. This includes integration of different data sources, data transformation, data cleansing and loading into data warehouses or analytics platforms.
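
      For illustration only, a minimal sketch of such a pipeline using Apache Beam (the Python SDK behind Cloud Dataflow) might look like the following; the project, bucket and table names are hypothetical placeholders, not part of the role description.

      # Minimal batch pipeline sketch: read raw CSV from Cloud Storage, clean it,
      # and load it into BigQuery. All resource names below are placeholders.
      import apache_beam as beam
      from apache_beam.options.pipeline_options import PipelineOptions

      def parse_csv_line(line):
          # Turn a raw "id,amount" CSV line into a BigQuery-ready dict.
          fields = line.split(",")
          return {"id": fields[0], "amount": float(fields[1])}

      options = PipelineOptions(
          runner="DataflowRunner",          # or "DirectRunner" for local testing
          project="my-gcp-project",         # placeholder project id
          region="europe-west1",
          temp_location="gs://my-bucket/tmp",
      )

      with beam.Pipeline(options=options) as p:
          (
              p
              | "ReadFromGCS" >> beam.io.ReadFromText("gs://my-bucket/raw/orders.csv", skip_header_lines=1)
              | "Parse" >> beam.Map(parse_csv_line)
              | "DropInvalid" >> beam.Filter(lambda row: row["amount"] >= 0)   # simple cleansing step
              | "LoadToBigQuery" >> beam.io.WriteToBigQuery(
                  "my-gcp-project:analytics.orders",
                  write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                  create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
              )
          )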

      Ensure that implemented data solutions meet security and compliance standards. Work closely with security and compliance teams to implement data security and privacy best practices.

      Provide technical advice to the team of Data Engineers and Data Scientists in the selection of algorithms, data modelling techniques and analysis strategies. Participate in code reviews and ensure the quality and robustness of the implemented solutions.

      Keep up to date with third-party solutions that complement the Data & AI ecosystem, and make sure knowledge of them is shared within the team.


      What we need to see from you

      Do you have experience working in Data and AI?
      We are looking for you!

      To join our Data Team at SoftwareOne France, we are looking for someone with experience and/or knowledge in:

      The principles and best practices of data architecture design, including the selection of appropriate data models, the choice of data warehouses and the implementation of efficient data pipelines.

      The services and tools offered by GCP (Google Cloud Platform): storage and data warehousing services such as Cloud Storage and BigQuery, data processing services such as Cloud Dataflow, and analytics and machine learning services such as BigQuery ML.
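
      Purely as an illustrative sketch (not part of the requirements), querying BigQuery from Python with the official google-cloud-bigquery client could look like this; the project, dataset and table names are hypothetical placeholders.

      # Run a simple aggregation query against a placeholder BigQuery table.
      from google.cloud import bigquery

      client = bigquery.Client(project="my-gcp-project")  # placeholder project id

      sql = """
          SELECT customer_id, SUM(amount) AS total_spent
          FROM `my-gcp-project.analytics.orders`
          GROUP BY customer_id
          ORDER BY total_spent DESC
          LIMIT 10
      """

      for row in client.query(sql).result():  # submits the job and waits for completion
          print(row.customer_id, row.total_spent)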

      Programming languages: Experience with Python or Scala, and the ability to write scripts and develop applications for data manipulation and processing.

      Database technologies: Familiarity with different types of databases, such as relational databases (e.g. MySQL, PostgreSQL) and NoSQL databases (e.g. MongoDB, Cassandra), and an understanding of the performance, scalability and data modelling trade-offs associated with each.
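
      As a side-by-side illustration only (not a requirement of the role), the same simple lookup could be expressed against a relational database and a document store as follows; connection strings, table and collection names are hypothetical placeholders.

      # Compare the same filter against PostgreSQL (psycopg2) and MongoDB (pymongo).
      import psycopg2
      from pymongo import MongoClient

      # Relational: rows in a fixed schema, filtered with SQL and bound parameters.
      pg = psycopg2.connect("dbname=shop user=analyst host=localhost")
      with pg.cursor() as cur:
          cur.execute("SELECT id, amount FROM orders WHERE amount >= %s", (100,))
          relational_rows = cur.fetchall()
      pg.close()

      # Document store: flexible, nested documents queried with a filter document.
      mongo = MongoClient("mongodb://localhost:27017")
      document_rows = list(mongo.shop.orders.find({"amount": {"$gte": 100}}))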

      ETL/ELT and data processing: Knowledge of extract, transform and load (ETL) or extract, load and transform (ELT) techniques and tools used to manipulate and transform data. Experience using tools such as GCP Dataflow or Apache Spark for distributed data processing.
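
      As an illustrative sketch only, a small ELT-style job in PySpark could look like the following; the bucket paths and column names are hypothetical placeholders.

      # Load raw CSV data, clean and aggregate it, and write a curated Parquet table.
      from pyspark.sql import SparkSession
      from pyspark.sql import functions as F

      spark = SparkSession.builder.appName("orders-etl").getOrCreate()

      raw = spark.read.option("header", True).csv("gs://my-bucket/raw/orders.csv")

      curated = (
          raw
          .withColumn("amount", F.col("amount").cast("double"))
          .filter(F.col("amount") >= 0)                      # basic cleansing
          .groupBy("customer_id")
          .agg(F.sum("amount").alias("total_spent"))
      )

      curated.write.mode("overwrite").parquet("gs://my-bucket/curated/customer_totals/")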

      Data analysis and visualisation: Understanding of data analytics concepts and techniques, including the use of popular libraries and tools such as pandas, NumPy, Tableau, Quicksight, Looker or Power BI for data analysis and visualisation.
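
      Again purely for illustration, a quick exploratory analysis with pandas and NumPy on a hypothetical orders file might look like this.

      # Monthly spend summary from a placeholder CSV file.
      import numpy as np
      import pandas as pd

      orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

      monthly = (
          orders
          .assign(month=orders["order_date"].dt.to_period("M"))
          .groupby("month")["amount"]
          .agg(total="sum", average="mean", p95=lambda s: np.percentile(s, 95))
      )

      print(monthly.head())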

      Security and compliance: Knowledge of security practices and measures to protect sensitive data and ensure regulatory compliance, including how to apply encryption techniques, secure data access and comply with regulations such as GDPR.

      Best practices and standards: Knowledge of best practices and standards in the field of data engineering and data science, including data quality management, data governance, code documentation and collaborative teamwork.

      Communication skills:
      Ability to communicate effectively with technical and non-technical team members, present technical ideas and solutions clearly and concisely, and guide discussions and collaborate with the rest of the team.

      We also value highly:

      Experience with tools such as Jira, Git and Confluence.

      #LI-EO1


      Job Function

      Software & Cloud Services
      Accommodations

      SoftwareOne welcomes applications from people of all backgrounds and abilities. If you require reasonable adjustments at any point during the recruitment process, email us at reasonable.accommodations@softwareone.com.

      Please include the role for which you are applying and your country location. Someone from our organization who is not part of the decision-making process will be in touch to discuss your specific needs, and we will make every effort to accommodate you. Any information shared will be stored securely and treated in the strictest confidence, in line with GDPR.


  • About the company

      SoftwareONE helps clients govern and manage their software estate – be it licensing optimization, effective procurement, or deploying cloud-based solutions.
