21 Big Data jobs in Oman
Lead Data Management
Posted today
Job Description
Execute the end-to-end process for the maintenance of all existing subsurface databases
Execute the end-to-end process for well and reservoir data in shared drives
Work with vendors to receive, QC, store, and publish the relevant well, log, and reservoir data
Ensure and participate in the development and implementation of the upstream technical data operating model, including data frameworks (governance, quality, etc.), guidelines, roadmaps, data standards, and best practices, while maintaining the highest standards of data quality, accessibility, extraction, and connectivity to the upstream petrotechnical applications.
Define and govern data stewardship for the company's upstream technical data to ensure compliance with data governance, enhance data quality, and establish clear accountability for stewardship of the company's data assets.
Participate in the development of the upstream technical data strategy and governance initiatives (policies, procedures, structures, roles, and responsibilities) to serve upstream data users across OQEP and to maximize the full potential of the upstream technical data.
Promote awareness of general data management principles across the organization through knowledge-sharing channels such as awareness sessions and training.
Develop and execute data science projects using machine learning and other data science analysis techniques
Work with vendors on new technologies related to data analysis and AI
Desired Candidate Profile
Bachelor's degree in computer science, information technology, or engineering, preferably in a subsurface discipline; a postgraduate degree is preferred.
8+ years of experience in information technology and relevant business disciplines
Experience with document management applications such as Documentum D2
Experience with WebGIS applications
Experience in handling large data sets
Experience in handling live data sources
Experience in handling different data types
Experience with machine learning, neural networks and AI
Soft:
- Detail-oriented.
- Interpersonal skills.
- Reliable.
- Problem-solving skills.
- Organizational skills.
- Strong written and verbal communication skills
- Presentation skills.
- Encourage engagement.
- Be organized.
- Interest in helping businesses and organizations succeed through innovation and fresh thinking.
- Ability to easily digest and simplify complex concepts into easy-to-understand summaries.
- A flexible attitude to work assignments.
Technical:
- Computer Literacy including MS Office
- Experience in Python and/or other coding languages
- Experience in managing large datasets
- SAP experience is a plus
- APEX/SQL experience is a plus
- C++ and Java experience is a major plus
- Experience in mobile app development is a major plus
Senior Data Management Consultant
Posted today
Job Description
- Lead implementation of Data Catalogue / Data Dictionary, capturing core data elements, metadata, and lineage.
- Review and sanitize transaction codes mapping across multiple systems, ensuring compliance with regulatory and business standards.
- Assess and optimize current data warehouse, marts, and reporting pipelines for performance, scalability, and redundancy.
- Define and implement best practices in data modeling, indexing, storage, and governance.
- Liaise with business and technical teams to ensure alignment and adoption.
- Provide documentation, training, and stakeholder communication.
- 8–10 years in data governance, data architecture, and metadata management.
- Strong knowledge of data catalog tools (Collibra, Alation, Informatica, or open-source equivalents).
- Expertise in data warehousing and BI (ETL, dimensional modeling, SQL performance tuning).
- Familiarity with banking and regulatory data requirements (IFRS, AML/KYC, CBO reporting).
- Strong stakeholder engagement and communication skills.
Retail Data Analysis(KA)(A140940)
Posted 4 days ago
Job Description
- Responsible for the sales data analysis of KA channels
- Through data analysis, identify the business issues of the KA channel in various countries and communicate with distributors and sales teams to find solutions.
- Responsible for coordinating the access of various products to KA channels in different countries and following up on sales performance.
- More than three years of working experience in the mobile phone or consumer electronics industry, with experience in Qatar or Oman preferred.
- Experience in sales management of KA channels and familiarity with the KA channel business model.
- Proficient in using Excel software for data analysis.
Data Science Engineer
Posted 3 days ago
Job Description
- Bachelor's degree in Computer Science or a related field.
- 1-2 years of experience in data analysis or data engineering.
Skills / Knowledge:
- Familiarity with programming languages like Python, R, or SQL.
- Solid understanding of statistical and machine learning concepts.
- Strong problem-solving and analytical skills.
- Excellent data visualization and presentation skills.
- Excellent communication and technical documentation abilities.
- Strong leadership and mentoring abilities.
- Ability to work collaboratively and learn quickly.
- Well-developed interpersonal skills and excellent communications skills in English.
- Respect & Integrity
- Problem Solving & Decision Making
The Junior Data Science Engineer supports data analysis projects by building and maintaining models, pipelines, and tools under the guidance of senior team members.
Key Accountabilities & Responsibilities
- Assist in designing, developing, and implementing advanced data models and predictive analytics to solve business problems and drive decision-making.
- Build, maintain, and optimize scalable and efficient data pipelines to facilitate seamless data integration and processing across the organization.
- Conduct comprehensive exploratory data analysis to uncover insights, trends, and patterns, and present findings in a clear and actionable manner.
- Support the implementation and optimization of machine learning algorithms, ensuring they meet performance, accuracy, and scalability requirements.
- Work closely with cross-functional teams, including business stakeholders and senior data scientists, to understand requirements and deliver tailored data-driven solutions.
- Develop and maintain thorough documentation of data processes, models, and methodologies to ensure reproducibility and transparency.
- Prepare detailed reports and dashboards to communicate insights and progress to stakeholders.
- Stay up-to-date with emerging trends, tools, and methodologies in data science and analytics.
- Actively contribute to improving team workflows and adopting innovative approaches to solving challenges.
IoT Data Engineer
Posted 9 days ago
Job Description
Overview
Canonical is a leading provider of open source software and operating systems to the global enterprise and technology markets. Our platform, Ubuntu, is very widely used in breakthrough enterprise initiatives such as public cloud, data science, AI, engineering innovation, and IoT. Our customers include the world's leading public cloud and silicon providers, and industry leaders in many sectors. The company is a pioneer of global distributed collaboration, with 1200+ colleagues in 75+ countries and very few office-based roles. Teams meet two to four times yearly in person, in interesting locations around the world, to align on strategy and execution. The company is founder-led, profitable, and growing. This is an exciting opportunity for a software engineer passionate about open source software, Linux, and Web Services at scale. Come build a rewarding, meaningful career working with the best and brightest people in technology at Canonical, a growing pre-IPO international software company.
Canonical's engineering team is at the forefront of the IoT revolution and aims to strengthen this position by developing cutting-edge telemetry and connectivity solutions. By integrating reliable, secure, and robust data streaming capabilities into the Snappy ecosystem, we are setting new standards in the industry for ease of development, implementation, management and security. We are seeking talented individuals to help us enhance our global SaaS services, providing customers with the essential data services needed to build the next generation of IoT devices effortlessly. Our commitment to data governance, ownership, and confidentiality is unparalleled, ensuring our customers can innovate with confidence on top of the globally trusted Ubuntu platform.
Location: This role will be based remotely in the EMEA region.
What your day will look like
- Work remotely with a globally distributed team, driving technical excellence and fostering innovation across diverse engineering environments.
- Design and architect high-performance service APIs to power streaming data services, ensuring seamless integration across teams and products using Python and Golang.
- Develop robust governance, auditing, and management systems within our advanced telemetry platform, ensuring security, compliance, and operational integrity.
- Partner with our infrastructure team to build scalable cloud-based SaaS solutions while also delivering containerized on-prem deployments for enterprise customers.
- Lead the design, implementation, and optimization of new features—taking projects from spec to production, ensuring operational excellence at scale.
- Provide technical oversight, review code and designs, and set best practices to maintain engineering excellence.
- Engage in high-level technical discussions, collaborating on optimal solutions with engineers, product teams, and stakeholders.
- Work remotely with occasional global travel (2-4 weeks per year) for internal and external events, fostering deeper collaboration and knowledge-sharing.
- You design and architect scalable backend services, messaging/data pipelines, and REST APIs using Golang or Python, guiding best practices, technical direction, and system scalability.
- You possess deep expertise in cybersecurity principles and proactively address the complex challenges of IoT environments—secure connectivity, data streaming, governance, and compliance.
- You bring proven expertise in designing and optimizing systems using:
- IAM models, encryption, access control, and compliance frameworks (GDPR, HIPAA) to ensure secure and compliant data handling.
- Decentralized data ownership models that ensure interoperability and governance across domains.
- High-throughput, low-latency system design for IoT data processing.
- Data streaming technologies (MQTT, Kafka, RabbitMQ)
- Observability tools (OpenTelemetry)
- Industrial/engineering data exchange protocols (OPC-UA, ModBus)
- You thrive in cross-functional environments, partnering with product teams, engineers, and stakeholders to drive high-impact technical solutions that align with business objectives.
- You mentor junior engineers, foster technical excellence, and contribute to a culture of innovation, continuous improvement, and knowledge sharing.
- You embrace challenges with an open mind, continuously seeking opportunities to learn, improve, and innovate in a rapidly evolving IoT landscape.
- You are familiar with Ubuntu as a development and deployment platform.
- You hold a Bachelor's degree or equivalent in Computer Science, STEM, or a related field.
- Willingness to travel up to 4 times a year for internal events.
We consider geographical location, experience, and performance in shaping compensation worldwide. We revisit compensation annually (and more often for graduates and associates) to ensure we recognize outstanding performance. In addition to base pay, we offer a performance-driven annual bonus or commission. We provide all team members with additional benefits which reflect our values and ideals. We balance our programs to meet local needs and ensure fairness globally.
- Distributed work environment with twice-yearly team sprints in person
- Personal learning and development budget of USD 2,000 per year
- Annual compensation review
- Recognition rewards
- Annual holiday leave
- Maternity and paternity leave
- Team Member Assistance Program & Wellness Platform
- Opportunity to travel to new locations to meet colleagues
- Priority Pass and travel upgrades for long-haul company events
About Canonical
Canonical is a pioneering tech firm at the forefront of the global move to open source. As the company that publishes Ubuntu, one of the most important open-source projects and the platform for AI, IoT, and the cloud, we are changing the world of software. We recruit on a global basis and set a very high standard for people joining the company. We expect excellence; in order to succeed, we need to be the best at what we do. Most colleagues at Canonical have worked from home since our inception in 2004. Working here is a step into the future and will challenge you to think differently, work smarter, learn new skills, and raise your game.
Canonical is an equal opportunity employer
We are proud to foster a workplace free from discrimination. Diversity of experience, perspectives, and background create a better work environment and better products. Whatever your identity, we will give your application fair consideration.
Python Data Engineer
Posted today
Job Description
Apt Resources is hiring a Python Data Engineer for our client in the banking sector. This role focuses on Big Data architectures and data warehousing, with hands-on involvement in tools like Tableau, Teradata, NIFI, and IBM DataStage.
Key Responsibilities:
- Design scalable Big Data and DWH architectures.
- Develop KPIs and dashboards to analyze market trends.
- Work with Tableau for data visualization and reporting.
- Perform data migration using NIFI and ETL with IBM DataStage.
- Collaborate with teams and vendors on data-driven initiatives.
- Maintain data privacy, security, and regulatory compliance.
- 3 to 4 years of experience in data engineering or related roles.
- Bachelor's degree in Engineering or a related field.
- Certifications in Big Data or Data Science are preferred.
- Proficiency in SQL, Python, PySpark, and Shell scripting (Bash).
- Experience with data visualization tools like Tableau.
- Familiarity with ETL tools such as IBM DataStage and data pipelines using NIFI.
- Knowledge of REST, JSON, SOAP, and Web Services APIs.
- Strong analytical, communication, and project leadership skills.
- Experience in telecom data (2G/3G) is an advantage.
Salary: USD 2,000 - 2,500 per month
Senior Data Engineer
Posted today
Job Description
Oman Air has built up a reputation as a strong, competitive leader in the airline industry. We are committed to recruiting and nurturing bright and dynamic individuals to meet our manpower needs. In the new millennium, our mission is to seek out new ways to develop and improve our position as a leader in aviation excellence.
We believe our people are the reason behind our success, and we offer you a once-in-a-lifetime opportunity to work in a team-based, customer-oriented environment. Our emphasis is on continual staff development, which we achieve through the training we provide to our staff members.
Role Objective
- To enable data-driven decision-making across the airline.
- You assist by designing and implementing comprehensive data quality, performance, and governance solutions throughout our data pipeline and data engineering processes.
Duties and Responsibilities
Area of responsibility
- As a Senior Data Engineer, you will work across the following data pillars:
- Ensure data integrity and quality.
- Develop, maintain and optimize data pipelines.
- Assemble and manage large, complex data sets to meet both functional and non-functional business requirements
- Enable broad organizational data access, including catalogue, validation and automation aspects.
- Extract and transform data from a wide variety of data sources and formats.
- Investigate and troubleshoot data-related problems such as data pipeline failures by utilizing tools, logs, and monitoring systems to identify root causes and implement solutions
- Identify internal process improvements, such as scalability enhancements, data delivery optimization, and automation of manual processes, and report these to relevant stakeholders
- Handle complex data engineering tasks aimed at creating and enhancing frameworks that promote automation and reusability
- Apply strategic and analytical skills to address customer and business-related questions, leveraging data to inform decisions, including consultation and collaboration with business stakeholders.
- Work under pressure while balancing delivery speed, reliability, and interpretability.
- Keep abreast of latest data engineering trends and company policies, ensuring all activities adhere to relevant rules and regulations
- Oversee the development of reusable components, frameworks, and libraries at scale
- Collaborate with cross-functional teams, including aviation safety experts, route planners, customer experience professionals, and business stakeholders, to understand their data needs, provide data solutions, and enable data-driven decision-making to enhance airline safety, efficiency, and customer satisfaction.
- Assist with the implementation of data quality standards, data governance practices, and data security measures to maintain the accuracy, reliability, and compliance of airline data, particularly in relation to safety and regulatory requirements
- Manage the administration and support aspects of core Data Engineering platforms
- Perform any other related tasks as assigned by the Management.
Education & Experience
- Bachelor's degree in a Business, IT, Mathematics, Science or Engineering discipline with four years of relevant work experience. A postgraduate degree in a relevant field would be an asset.
- Two-year college diploma in a Business, IT, Mathematics, Science or Engineering discipline with six years of relevant work experience.
- Specialized certificate/license in the related field with aviation experience, plus a secondary school certificate and 10 years of relevant work experience (for internal candidates only).
- Airline or logistics experience would be advantageous
Special Skills & Knowledge
- Proven knowledge and expertise in database technologies, data warehousing concepts, and cloud platforms.
- Proficiency in English (must) and Arabic (preferred).
- Technical knowledge of Big Data and Data Warehouses, including Oracle, Microsoft ADLS, Synapse, and Power BI.
Data Science Manager
Posted today
Job Description
SWATX is seeking a highly skilled and experienced Data Science Manager to lead our growing data science team. In this strategic role, you will be responsible for overseeing the development and implementation of data-driven solutions to solve complex business challenges. You will mentor and guide a team of data scientists, driving innovation and excellence in analytics and machine learning. If you are a strong leader with a passion for data science and a proven track record of delivering impactful solutions, we invite you to join us.
Responsibilities:
- Lead and mentor a team of data scientists, providing guidance on best practices in data analysis, machine learning, and statistical modeling
- Develop and execute the data science strategy aligned with business objectives, ensuring that data-driven insights are integrated into decision-making processes
- Oversee the design and implementation of innovative data science projects that drive value for the organization
- Collaborate with cross-functional teams to identify opportunities for leveraging data to improve products, services, and operational efficiency
- Build and maintain strong relationships with stakeholders, understanding their data needs and ensuring timely delivery of insights
- Monitor and evaluate the performance of data science models and adjust strategies as necessary to achieve desired results
- Promote a data-driven culture within the organization by communicating the value of data science initiatives to stakeholders at all levels
- Stay updated on the latest trends and developments in data science and analytics, and integrate new methodologies and tools as appropriate
- Bachelor's or Master's degree in Data Science, Computer Science, Statistics, Mathematics, or a related field
- Proven experience in a data science role, with 5+ years of experience, including 2+ years in a managerial or leadership position
- Strong proficiency in programming languages such as Python and R, plus experience with data manipulation and analysis libraries
- Solid understanding of machine learning algorithms, statistical methodologies, and data modeling techniques
- Experience with data visualization tools (e.g., Tableau, Power BI) to communicate findings effectively
- Excellent project management skills and ability to prioritize tasks in a fast-paced environment
- Strong analytical and problem-solving skills with attention to detail
- Exceptional communication skills, both verbal and written, in English and Arabic
- Proven capability to drive collaboration across teams and influence senior stakeholders
- Certified Data Scientist (CDS)
- Microsoft Certified: Azure Data Scientist Associate
- Google Cloud Professional Data Engineer
Data Infrastructure Engineer
Posted 10 days ago
Job Description
This role is exclusively for Omani nationals and requires candidates to be Omani citizens.
We are looking for a talented and experienced Data Infrastructure Engineer to join our team. This role focuses on building, deploying, maintaining, and optimizing data infrastructure that supports our data-driven operations. You will work with large-scale data processing systems, ensuring they are robust, scalable, and high-performing in production environments. The ideal candidate has strong technical skills in distributed data systems, particularly in deploying and managing these systems, and is passionate about creating efficient data workflows that drive actionable insights.
Key responsibilities
- Design, develop, and maintain scalable data infrastructure solutions to support large-scale data processing and analytics.
- Deploy and configure distributed data systems, including data storage (e.g., HDFS, cloud storage) and data processing frameworks (e.g., Hadoop, Spark), ensuring they are resilient, optimised, and production-ready.
- Build and automate ETL workflows, managing data extraction, transformation, and loading processes to ensure data quality, consistency, and availability.
- Monitor the health and performance of data infrastructure, proactively troubleshooting and resolving performance bottlenecks and operational issues to maintain system stability.
- Optimise and scale infrastructure, leveraging containerisation (e.g., Docker) and orchestration (e.g., Kubernetes) to manage resources efficiently and support growing data volumes.
- Implement data governance practices and enforce best practices for data handling, quality, security, and compliance.
- Collaborate with data engineers, analysts, and cross-functional teams to understand data requirements and support data accessibility across the organisation.
- Stay updated on the latest trends and tools in data infrastructure, evaluating and recommending new technologies to enhance our capabilities.
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 2+ years of experience in building, deploying, and managing data infrastructure, particularly in distributed data systems.
- Proficiency in big data technologies, including Hadoop, Spark, Hive, or related frameworks.
- Strong programming skills in languages such as Python, Java, or Scala.
- Hands-on experience with cloud platforms (e.g., Google Cloud, AWS, Azure) and cloud storage solutions.
- Knowledge of data formats such as Parquet, Avro, or ORC, and data querying tools like HiveQL.
- Familiarity with data pipeline orchestration tools, such as Apache Airflow or Luigi.
- Excellent problem-solving skills and the ability to work collaboratively within a team environment.
- Experience with additional data engineering tools like Apache Kafka, Apache NiFi, or Flume.
- Telecom industry experience, particularly in building data solutions for network performance or customer analytics.
- Familiarity with DevOps practices and CI/CD pipelines in a data engineering environment.
- Competitive Compensation : A comprehensive salary package with performance incentives.
- Career Development : Access to ongoing training, certifications, and professional development resources.
- Health and Wellness : Full health insurance, retirement plans, and wellness programs.
- Innovative Projects : Opportunities to work on high-impact data solutions with a talented, driven team.
- Cutting-Edge Environment : A collaborative culture that values creativity, learning, and innovation.
- Shape the future of digital ecosystems: Be part of a team that's redefining digital ecosystems management to make it intelligent, adaptive, and capable of supporting future demands.
- Innovate for impact: Work on cutting-edge technologies like AI, IoT, and data analytics to address real-world challenges in infrastructure.
- Empower smart cities: Contribute to building the foundation for cognitive cities - urban environments that are resilient, efficient, and adaptable.
- Grow with us: Join a dynamic, mission-driven team that values collaboration, innovation, and growth. We are committed to creating a workplace where you can thrive, learn, and make a meaningful impact.