Big Data Architect in Skopje, Macedonia

We are looking for a Big Data Architect to join our rapidly growing development team in Skopje.


Symphony – Why So Special?

At Symphony Solutions we have removed the barriers created by traditional organizations and embraced organic principles and a high degree of self-management. We believe this kind of organization is the optimal environment to attract and retain the best talent, develop people fully, and leverage their potential.
We have a unique employee selection process where colleagues choose colleagues. This approach eliminates potential conflicts and ensures an honest and transparent relationship with clients and within the team. Symphony Solutions strives to offer the best price/performance and to be the easiest company to do business with.
Symphony Solutions in Skopje is currently looking for a Big Data Architect for a full-time position, who will contribute to the business transformation process and to the long-term success and growth of the company.

General requirements:

  • A strong desire to work in a customer-facing, consulting-type role;
  • Experience architecting Big Data platforms using Apache Hadoop distributions such as Cloudera, Hortonworks, and MapR;
  • Strong knowledge of cloud data processing architectures;
  • In-depth knowledge of popular database and data warehouse technologies from Microsoft, Amazon, and/or Google (Big Data and conventional RDBMS), such as Microsoft Azure SQL Data Warehouse, Teradata, Redshift, BigQuery, Snowflake, etc.;
  • Demonstrated knowledge of data warehouse concepts;
  • Strong understanding of distributed systems architecture;
  • Fluency in at least two object-oriented languages, preferably Java and Python, familiarity with functional languages, and proficiency in SQL;
  • Experience with Hive and Impala;
  • Experience with major distributed processing frameworks: MapReduce, Spark, Storm, Flink;
  • Experience with Kafka and Spark Streaming;
  • Ability to work with software engineering teams and understand complex development systems and patterns;
  • Ability to travel and work across North America frequently (occasionally on short notice) up to 50% of the time with some international travel also expected.

Nice to have:

  • Experience with BI platforms, reporting tools, data visualization products, ETL engines;
  • Experience with real-time Hadoop query engines such as Dremel, Cloudera Impala, Facebook Presto, or Berkeley Spark/Shark;
  • DevOps experience with a good understanding of continuous delivery and deployment patterns and tools (Jenkins, Artifactory, Maven, etc.);
  • Experience with HBase.

Main responsibilities:

  • Collaborating with and supporting our client's sales teams in the pre-sales and account management process from the technical perspective, remotely and on-site, for Big Data-focused consulting projects;
  • Travelling frequently to client sites to provide face-to-face consulting services, meetings, and presentations, as well as delivering work and value from home;
  • Conducting technical audits of clients' existing architectures (infrastructure, performance, security, scalability, and more) and documenting best practices and recommendations;
  • Leading and meeting expectations for client deliverable completion timelines;
  • Working with Pythian's clients to help them develop data architecture models that further enable effective service-oriented delivery;
  • Providing component or site-wide performance optimizations and capacity planning;
  • Mentoring and working with both internal and client development teams, delivering directly against an agile backlog to help engineer highly available, more manageable Big Data platforms.

Send us your CV using the form below