Our stack is divided into five pillars, and within each pillar we specialize in a few technologies that together can cover virtually any of your use cases. The best stack for your company is determined by multiple factors:
- Skills of your data team
- Big Data triangle (velocity, variety, volume)
- Predisposition to open-source technologies
We ingest data from your transactional databases and applications seamlessly, using point-to-point and no-code technologies. Our preferred tools are:
This is the backbone pillar, where data from the transactional systems is transformed into a new data model that feeds several downstream applications. Orchestrating those data transformations, operating them at a fast pace and with full visibility requires mastering a combination of best-in-class tools.
Getting value out of the data is the ultimate goal. Pillars 1 and 2 lay the foundations that allow this third pillar to flourish. We focus on four forms of distribution:
Data quality, observability, security and privacy are becoming must-haves for any data ecosystem due to stricter regulations and business needs. Data governance tools are booming, and this can lead to confusion when it comes to selecting one. Our team has experience with the mainstream enterprise-grade solutions as well as with the major open-source tools. Our approach to data governance starts with a study of your data ecosystem to determine the best selection of tools. Below is a non-exhaustive list of top-notch solutions covering the spectrum of data governance needs:
This is the glue that holds the five pillars together. None of the previous pillars can last on its own if automation is not present all along the way. We use the following tools to automate this process:
Many years of experience across the mainstream clouds have convinced us that Google Cloud is one step ahead as far as data analytics and Machine Learning are concerned. Each of our engineers is fully proficient in Google Cloud, with at least five Google Cloud certifications. While our expertise lies on the analytics side, other areas such as networking and infrastructure hold no surprises for us and will not be lasting blockers while implementing your analytical solutions.
To realize your data journey efficiently, we host most of the open-source technologies we use on GKE (Google Kubernetes Engine), which allows us to leverage its scalability, high availability and strong security features.
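As a minimal sketch of what this hosting pattern looks like (cluster, region, project and manifest names here are illustrative placeholders, not our actual deployment), a GKE Autopilot cluster can be provisioned and an open-source workload deployed onto it with the standard Google Cloud and Kubernetes CLIs:

```shell
# Provision a GKE Autopilot cluster: Google manages the nodes,
# autoscaling and security patching of the control plane and workers.
gcloud container clusters create-auto analytics-cluster \
  --region=europe-west1 \
  --project=my-data-project

# Fetch credentials so that kubectl can talk to the new cluster.
gcloud container clusters get-credentials analytics-cluster \
  --region=europe-west1 \
  --project=my-data-project

# Deploy an open-source workload (a hypothetical manifest for an
# orchestrator such as Airflow) onto the cluster.
kubectl apply -f airflow-deployment.yaml
```

Autopilot mode keeps the operational burden low while still providing the scaling and high-availability guarantees mentioned above.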
Knowledge transfer is a fundamental part of each of our projects. We do not just deliver a “one-shot” project; we are committed to making our solutions last and evolve over time through your team. We do this by involving and training your data team in every step of the implementation. At the end of the project, we accompany your team with a mix of maintenance and custom training. The journey only ends once you can fly on your own with the modern data stack in place.
We do not sell a product, we sell a data journey with Astrafy pillars and values.
Our coding rules
- Thorough design before any implementation
- Documentation is as important as code
- Open source over licensed software
- Easy to maintain over fast and complex
- Python, Golang and Rust