About the Company/Organization
An online platform that connects young people with opportunities, both locally and globally.
Key Responsibilities:
- Identify and integrate data sources to meet business stakeholder requirements.
- Address data quality issues in collaboration with data source owners.
- Deploy advanced algorithms, analytics programs, and statistical methods.
- Perform Extract, Transform, Load (ETL) operations to prepare data for analysis.
- Design and maintain scalable data pipelines in Azure cloud environment.
- Integrate new data management technologies and tools.
- Build datasets to support data-driven insights and decision-making.
- Orchestrate workflows using Apache Airflow for efficient data pipeline execution.
- Develop customized software components and analytics applications.
- Collaborate with colleagues to ensure a flexible and scalable data platform.
- Maintain project documentation.
Requirements & Preferred Certifications:
- Bachelor's or Master's degree in Computer Science or a related field.
- Relevant certifications are preferred.
Preferred Skillset:
- 5+ years of experience in data engineering roles, with a proven track record of delivering complex projects.
- Familiarity with data architecture and big data technologies.
- 5+ years of experience in Python (OpenCV, Pandas, NumPy, PySpark), C# (Entity Framework and API integration), and web frameworks (.NET).
- 5+ years of experience with databases (SQL: Microsoft SQL Server, PostgreSQL, MySQL; NoSQL: MongoDB, Elasticsearch).
- 2+ years of experience with Apache Airflow.
- Technical writing experience for documentation.
- Alteryx experience is a plus.
- Experience with machine learning is a plus.
- Experience with data visualization tools (Tableau, Power BI) is a plus.
Application Process:
Interested candidates can apply by filling out the form through the "Apply Now" button above.