A Dutch IT outsourcing company with numerous international clients and delivery centers in three countries.
28 December 2016
Symphony Solutions – Why So Special?
Symphony Solutions is an international Dutch IT company with offices in Ukraine, Macedonia, and Poland. We have been on the market for more than 10 years and have preserved a unique culture across all our locations.
At Symphony Solutions we have removed the barriers created by traditional organizations and embraced organic principles and a high degree of self-management. We believe this kind of organization is the optimal environment to attract and retain the best talent, develop people fully, and leverage their potential.
We have a unique employee selection process in which colleagues choose colleagues. This approach reduces potential conflicts and ensures honest, transparent relationships with clients and within the team. Symphony Solutions strives to offer the best price/performance and to be the easiest company to do business with.
Your role as a Big Data Engineer is to provide consulting services to our clients, including planning, designing, and implementing new solutions using the latest Big Data technologies. You will help develop cutting-edge Big Data solutions and create data pipelines that migrate data from customers' on-prem systems into cloud-hosted enterprise data platforms. Your job will also involve large-scale custom Big Data consulting projects.
Our client is a Canadian multinational corporation that designs, implements, and manages systems that directly contribute to revenue and business success. Specializing in Infrastructure, Data, and Cloud services, it helps other enterprises with all aspects of the cloud environment and delivers fully managed solutions.
Requirements:
- Experience building data pipelines in any public cloud (Google Cloud Dataflow, AWS Glue, Azure Data Factory, or equivalent);
- Experience writing ETL jobs (with any popular tools);
- Experience in data modeling, data design and persistence (e.g. warehousing, data marts, data lakes);
- Strong knowledge of Big Data architectures and distributed data processing frameworks: Hadoop, Spark, Kafka, Hive;
- Experience and working knowledge of various development platforms, frameworks and languages such as Java, Python, Scala and SQL;
- Experience with Apache Airflow, Oozie, or NiFi is a plus;
- General knowledge of modern data-center and cloud infrastructure including server hardware, networking and storage;
- Strong written and verbal English communication skills;
- 5+ years of experience in a similar role.
Nice to have:
- Experience with BI platforms, reporting tools, data visualization products, ETL engines;
- Experience with data streaming frameworks;
- DevOps experience with a good understanding of continuous delivery and deployment patterns and tools (Jenkins, Artifactory, Maven, etc.);
- Experience with HBase;
- Experience in data management best practices, real-time and batch data integration, and data rationalization.
Responsibilities:
- Working with the Data Architects to implement data pipelines;
- Working with our Big Data Principal Architects on the development of both proofs of concept and complete implementations;
- Working on complex and varied Big Data projects including tasks such as collecting, parsing, managing, analyzing, and visualizing very large datasets;
- Translating complex functional and technical requirements into detailed designs;
- Writing high-performance, reliable and maintainable code;
- Performing data processing requirements analysis;
- Performance tuning for batch and real-time data processing;
- Securing components of clients’ Big Data platforms;
- Diagnostics and troubleshooting of operational issues;
- Health-checks and configuration reviews;
- Data pipeline development – ingestion, transformation, cleansing;
- Assisting application developers and advising on efficient data access and manipulations;
- Dogfooding our product is important, so a short (~3-month) on-call rotation providing second-level support is required for all team members. This works out to being on call less than once a year and ensures the quality of implementations.
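The ingestion, transformation, and cleansing flow named in the responsibilities above can be sketched in a few lines of Python. This is purely an illustrative example: the field names and data are hypothetical, and real pipelines would use frameworks such as Spark or Airflow rather than plain functions:

```python
import csv
import io

def ingest(raw_csv: str) -> list[dict]:
    """Ingestion: parse raw CSV text into dict rows."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Transformation: normalize field names, types, and casing."""
    return [
        {"user_id": int(r["id"]), "email": r["email"].strip().lower()}
        for r in rows
    ]

def cleanse(rows: list[dict]) -> list[dict]:
    """Cleansing: drop records that fail basic validation."""
    return [r for r in rows if "@" in r["email"]]

# Hypothetical source extract: one valid record, one invalid.
raw = "id,email\n1, Alice@Example.com \n2,invalid\n"
pipeline = cleanse(transform(ingest(raw)))
```

After running, `pipeline` holds only the normalized, valid record; the malformed email is filtered out at the cleansing stage.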
We offer:
- Friendly and highly professional teams;
- Competitive salary and compensation package;
- Career and professional growth;
- Regular (twice a year) performance reviews;
- Paid English classes;
- Casual Fridays, corporate events, birthday/wedding presents;
- Comfortable office facilities (kitchen, coffee machines, massage room, various training sessions, etc.);
- Clubs of interests (travel & bicycle club, symphony cuisine, music band, choir);
- Sports activities (football team, yoga, fitness, stretching classes);
- Low hierarchy and open communication;
- The coolest office in Western Ukraine.