AtScale vs Dremio: What are the differences?
Introduction
AtScale and Dremio are two popular data virtualization platforms that provide organizations with the ability to access and analyze large datasets from various sources. While they both offer similar functionalities, there are key differences between the two. In this article, we will discuss the main differences between AtScale and Dremio.
Data Source Support: Both platforms support a wide range of data sources. AtScale connects to traditional relational databases, Hadoop-based platforms, cloud-based storage systems, and more, while Dremio's coverage additionally extends to NoSQL databases and raw file systems.
Data Virtualization Capabilities: AtScale primarily focuses on providing data virtualization capabilities for BI and analytics use cases. It offers features such as query optimization, caching, and semantic layer creation to enable faster data access and analysis. In contrast, Dremio is a full-fledged data lake engine that not only provides data virtualization but also advanced capabilities such as query acceleration through data reflections and data lineage tracking.
Deployment Options: AtScale is typically deployed as an on-premises software solution or hosted on a private cloud infrastructure. It offers options to integrate with existing data platforms and tools. On the other hand, Dremio is a cloud-native platform that can be deployed on public, private, or hybrid clouds. It also provides a fully managed SaaS offering for organizations that prefer a hands-off approach.
Data Governance and Security: AtScale focuses on providing robust data governance and security features, including fine-grained access control, data masking, and data lineage tracking. It ensures compliance and data protection in regulated industries. Dremio also offers data governance capabilities, but with additional features like data cataloging, data classification, and policy-based access controls.
Performance Optimization: AtScale uses techniques like intelligent caching and query optimization to enhance query performance. It leverages its virtualization layer to translate BI tool queries into optimized queries for the underlying data sources. Dremio, for its part, employs optimization techniques like data reflections and distributed query execution to accelerate query performance and deliver real-time analytics capabilities.
Operating Models: AtScale follows a federated query model, where data stays in the source systems, and AtScale acts as a query federation layer. It provides a unified view of the data across the sources without physically moving or duplicating the data. Dremio, on the other hand, uses a data lake model, where data is consolidated in a central location and is made available for querying and analysis. It focuses on providing a self-service data platform for data exploration and analysis.
In summary, AtScale and Dremio differ in terms of their data source support, data virtualization capabilities, deployment options, data governance and security features, performance optimization techniques, and operating models.
We need to perform ETL from several databases into a data warehouse or data lake. We want to
- keep raw and transformed data available so users can draft their own queries efficiently
- give users custom permissions and SSO
- move between open-source on-premises development and cloud-based production environments
We want to use only inexpensive Amazon EC2 instances, on medium-sized datasets (16 GB to 32 GB), feeding into Tableau Server or Power BI for reporting and data analysis.
You could also use AWS Lambda with a CloudWatch Events schedule if you know when the function should be triggered. The benefit is that you can write the function in any supported language and use the respective database client.
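To make that concrete, here is a minimal sketch of such a scheduled Lambda in Python. The extract_rows helper, bucket name, and object key are placeholders for illustration, not part of any specific product or the original answer:

```python
import csv
import io

import boto3

s3 = boto3.client("s3")

def extract_rows():
    # Placeholder: replace with a real query using your database client
    # of choice (psycopg2, pymysql, pyodbc, ...).
    return [{"id": 1, "amount": 9.99}, {"id": 2, "amount": 4.50}]

def handler(event, context):
    # Entry point invoked by the CloudWatch Events / EventBridge schedule rule.
    rows = extract_rows()

    # Serialize the extracted rows to an in-memory CSV buffer.
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)

    # Land the file in S3 so the warehouse/lake (and Tableau or Power BI) can pick it up.
    s3.put_object(
        Bucket="my-etl-staging",    # placeholder bucket name
        Key="extracts/orders.csv",  # placeholder object key
        Body=buf.getvalue().encode("utf-8"),
    )
    return {"rows_written": len(rows)}
```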
But if you need to orchestrate ETL pipelines, it makes sense to use Apache Airflow. This requires Python knowledge.
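As a rough sketch of what that looks like, a minimal Airflow DAG for a daily extract-and-load job could be structured as follows (the dag_id, task names, and the extract/load bodies are placeholders):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # Placeholder: pull rows from the source database with your client of choice.
    print("extracting from source database")

def load(**context):
    # Placeholder: write the transformed rows into the warehouse or data lake.
    print("loading into warehouse / data lake")

with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # called `schedule` in newer Airflow releases
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # load only runs after extract succeeds; Airflow handles scheduling and retries.
    extract_task >> load_task
```

The `>>` operator declares the dependency between tasks, which is where Airflow adds value over standalone scheduled functions once pipelines have more than a couple of steps.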
Though we have always built something custom, Apache Airflow (https://airflow.apache.org/) stood out as a key contender/alternative among open-source options. On the commercial side, Amazon Redshift combined with Amazon Kinesis (for complex manipulations) is great for BI, though Redshift itself is expensive.
You may want to look into a data virtualization product called Conduit. It connects to disparate data sources in AWS, on-prem, Azure, and GCP, and exposes them as a single unified Spark SQL view to Power BI (DirectQuery) or Tableau. It supports automatic query and caching policies to improve query speed and user experience, has a GPU query engine with optimized Spark as a fallback, and can be deployed on your AWS VMs or on-prem, scaling both up and out. It sounds like a good fit for your needs.
I am trying to build a data lake by pulling data from multiple data sources (custom-built tools, Excel files, CSV files, etc.) and using the data lake to generate dashboards.
My question is which is the best tool to do the following:
- Create pipelines to ingest the data from multiple sources into the data lake
- Help me aggregate and filter the data available in the data lake.
- Create new reports by combining different data elements from the data lake.
I need to use only open-source tools for this activity.
I appreciate your valuable inputs and suggestions. Thanks in advance.
Hi Karunakaran. I obviously have an interest here, as I work for the company, but the problem you are describing is one that Zetaris can solve. Talend is a good ETL product, and Dremio is a good data virtualization product, but the problem you are describing best fits a tool that can combine the five styles of data integration: bulk/batch data movement, data replication/synchronization, message-oriented movement of data, data virtualization, and stream data integration. I may be wrong, but Zetaris is, to the best of my knowledge, the only product in the world that can do this.

Zetaris is not a dashboarding tool - you would need to combine us with Tableau, Qlik, Power BI, or similar - but Zetaris can consolidate data from any source and any location (structured or unstructured, on-prem or in the cloud) in real time to give clients a consolidated view of whatever they want, whenever they want it. Please take a look at www.zetaris.com for more information. I don't want to do a "hard sell" here, so I'll say no more! Warmest regards, Rod Beecham.
Pros of AtScale
Pros of Dremio
- Nice GUI to enable more people to work with data (3)
- Connect NoSQL databases with RDBMS (2)
- Easier to deploy (2)
- Free (1)
Cons of AtScale
Cons of Dremio
- Works only on Iceberg structured data (1)