Position: Data Engineer
Reports To: CEO
The Opportunity:
Success Outcomes:
1. Design and Implement Scalable Data Solutions
You will develop back-end systems that handle large datasets and create APIs that integrate these data systems with front-end applications.
Key Responsibilities:
Build and maintain scalable ETL (Extract, Transform, Load) pipelines to process and manage data efficiently.
Design and implement data models and schemas that support business needs.
Develop and optimize RESTful APIs for seamless data flow across applications.
2. Optimize Data Performance
Key Responsibilities:
Optimize database performance using indexing, query optimization, and caching strategies.
Implement real-time data processing solutions to handle high-volume data streams effectively.
Ensure data integrity and consistency across distributed systems.
3. Collaborate and Learn New Technologies
Key Responsibilities:
Work closely with front-end teams to ensure smooth integration of data with the user interface.
Participate in code reviews and contribute to team knowledge sharing.
Continuously learn and adapt to emerging technologies and industry trends.
Who You Are:
- You have strong data engineering experience and development skills
- You work effectively with back-end technologies such as Node.js, as well as data processing frameworks
- You’re constantly learning and applying new approaches to solve complex data challenges
- You thrive in environments that demand high collaboration and teamwork
- You focus on creating maintainable, scalable data solutions
Required Qualifications:
- Strong experience with back-end development using Node.js or similar technologies.
- Solid understanding of data pipelines, ETL processes, and data transformation.
- Experience designing and implementing RESTful APIs and working with databases (SQL/NoSQL).
- Proficiency in optimizing performance for large datasets, including indexing, caching, and query optimization.
- Familiarity with data visualization and integrating data into user interfaces.
- Experience with Git and collaborative development workflows.
Nice-to-Have Skills:
- Geospatial expertise (e.g., experience with PostGIS, GeoJSON, and geospatial data integration).
- Familiarity with cloud platforms (AWS, Azure) and containerization technologies (Docker, Kubernetes).
- Experience with real-time data processing and stream processing tools.
Soft Skills:
- Strong analytical and problem-solving skills, with attention to detail.
- Excellent communication skills and ability to work effectively in cross-functional teams.
- Ability to learn quickly and adapt to new technologies and methodologies.
- Strong collaboration skills and a desire to contribute to a positive team culture.