We all know that the amount of unstructured data - emails, documents, social media, web content, images and video - is exploding. We also know that generating actionable intelligence from this data is time-consuming and inefficient.
We think people should spend time doing the stuff that people do well, and let the machines do the stuff that machines do better. We apply our expertise to tackle our customers’ challenges, freeing people to do what they do, faster and better. We build our AI and ML technology into products that are easy to deploy, manage and use. We want all our tools to augment and empower knowledge-intensive processes and to help our customers analyse large volumes of data efficiently and effectively.
Our customers are engaged in some of the most important and challenging work in the UK and around the globe, analysing billions of data points each day. The volume and complexity of their data is growing rapidly and represents a significant opportunity for us to make a positive impact for our customers.
We have a small and growing team with decades of experience applying ML and AI in the real world. Our team specialises in Natural Language Processing, Complex Network and Graph Theory, Time Series Analysis, CI/CD Cloud Solutions, Distributed System Architectures and Microservices. We have senior bankers and junior developers, PhDs and self-taught hackers. Whatever your background, we hope you will find a home in our inclusive and diverse team.
We pride ourselves on our close academic relationships with many of the UK’s leading software engineering and ML universities, where we aim to apply cutting-edge research to real-world, client-led problems as soon as it is viable. We hold quarterly hackathons, where we take a break from the sprint rhythm, take some of those blue-sky ideas, and encourage ourselves to make something happen!
We are looking for a Principal Engineer to join our architecture delivery team, helping to lead our efforts to engineer the world’s best platform for delivering unstructured content and NLP-oriented solutions to our users.
You will spend most of your time designing the architecture that you will then be coding, and helping others to build. As the Principal Engineer on our systems team, you will be expected to innovate, mentor and lead by example, helping other team members create a culture of engineering excellence.
You will be building microservices to run in our Kubernetes infrastructure, providing adarga_engine™ endpoints for use by both adarga_bench™ and our customers. You will also be creating and reviewing design documents as well as code, and - alongside your team - taking ownership of the products you ship into production.
Adarga is one of the few companies in the UK making Artificial Intelligence work on the ground, today, in real products. If you have a passion for making products that analyse vast amounts of data faster, more accurately and more effectively, then why not help us build a better data future?
· Proven experience designing and building a microservice back-end ecosystem
· Experience writing software in one or more languages such as Java, C++, Python, Go, etc.
· Deep experience with microservice design and implementation in the cloud
· Strong communication skills and ability to lead and push forward initiatives to create value for the business
· Experience with microservices or modern web-service architectures for back-end development
· Demonstrable understanding of modern distributed systems architectures
· Good working knowledge of Git, Docker and Kubernetes
· Good understanding of CI/CD techniques
· Experience with SQL and NoSQL databases (PostgreSQL, Elasticsearch, Neo4j)
· Practical experience with Agile development and Peer Code Review
· Experience of REST APIs
· Experience with GraphQL
· Be a technical lead on our Engine™ system
· Provide technical leadership, mentoring and career growth to a growing team
· Design and build the ecosystem of microservices and associated PaaS/IaaS systems into an end-to-end delivery system
· Work with both structured and unstructured data sets (NLP)
· Assemble large, complex data sets that meet functional and non-functional business requirements
· Identify, design and implement internal process improvements: automating manual processes, optimising data delivery, redesigning infrastructure for greater scalability, etc.
· Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
· Build analytics tools that use the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics
· Keep our data separated and secure