Farmerline IT Jobs in Ghana
1. Patiently scroll down and read the job description below.
2. Scroll down and find how to apply or mode of application for this job after the job description.
3. Carefully follow the instructions on how to apply.
4. Always apply for a job by attaching a CV with a cover letter / application letter.
As part of business growth and expansion, Farmerline is looking for a Data Engineer to join our growing team of data experts. The hire will be responsible for expanding and optimizing our databases and data pipeline architecture, and for improving data flow and collection for cross-functional teams.
The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our Software Developers, BI Analysts and Data Scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects.
They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives.
Responsibilities
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, MongoDB, and Azure big data technologies.
- Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity.
- Collaborate with Engineering and Business units to improve the data models that feed business intelligence tools and our products, increasing data accessibility for both internal and external stakeholders and fostering data-driven decision making across the organization.
- Implement processes and systems to monitor data quality, ensuring production data is always accurate and available for the key stakeholders and business processes that depend on it.
- Perform the data analysis required to troubleshoot data-related issues and assist in their resolution.
- Work closely with a team of frontend and backend engineers, product managers, and analysts.
- Design data integrations and a data quality framework.
- Design and evaluate open-source and vendor tools for data lineage.
- Work closely with all business units and engineering teams to develop a strategy for long-term data platform architecture.
Required Qualifications
- BS or MS degree in Computer Science or a related technical field
- 2+ years of Python & Java development experience
- 3+ years of SQL experience (NoSQL experience is a plus)
- 3+ years of experience with schema design and dimensional data modelling
- Ability to manage and communicate data warehousing plans to internal stakeholders
- Experience designing, building, and maintaining data processing systems
- 2+ years of experience working with Azure and other cloud technologies
- Experience with Hadoop, Apache Airflow and Spark is a plus