Apache Drill vs Dremio: What are the differences?


Apache Drill and Dremio are both powerful data exploration and analysis tools that work with a variety of data sources. Both aim to enable self-service analytics, but there are key differences between the two platforms.

  1. Data Virtualization Approach: Apache Drill is based on the concept of data virtualization, which enables users to query and analyze data stored in various sources with a unified interface. It allows users to perform complex queries on different types of data without the need for data integration or transformation. On the other hand, Dremio takes a hybrid approach, combining aspects of data virtualization and data acceleration. It caches and accelerates data from different sources to provide faster query performance, while also offering virtualization capabilities.

  2. Architecture and Deployment: Apache Drill follows a distributed architecture, where the query execution is distributed across multiple nodes in a cluster. It can be deployed on premises or in the cloud. Dremio, on the other hand, is designed as a single coherent system, making it easier to deploy and manage. It can be deployed on a cluster of machines or run as a single node, depending on the scale of usage.

  3. Enterprise-Grade Features: Dremio offers a range of enterprise-grade features that are not available in Apache Drill. These include advanced security features like LDAP and Active Directory integration, column-level and row-level access controls, and encryption at rest. Dremio also provides features like job scheduling, workload management, and data lineage tracking that are not present in Apache Drill.

  4. Data Reflections: Dremio introduces the concept of data reflections, which are materialized views that store pre-aggregated or pre-joined data from the underlying sources. These reflections can significantly improve query performance by reducing the amount of data that needs to be scanned. Apache Drill does not provide a similar feature out-of-the-box but can achieve similar optimizations using techniques like query planning and optimization.

  5. User Experience and SQL Capabilities: Dremio focuses on providing a user-friendly experience with a web-based interface for data exploration and visualization. It offers a rich set of SQL capabilities including window functions, derived tables, and support for various data types. Apache Drill also provides SQL capabilities but may have a steeper learning curve compared to Dremio.

  6. Community and Support: Apache Drill is an open-source project supported by a diverse community of developers and users. While it offers community support, dedicated commercial support is also available. Dremio, on the other hand, is an enterprise software platform with dedicated commercial support and additional enterprise-oriented features. It also has an active community and offers a free community edition for non-production use.
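
A minimal Python sketch of the idea behind a reflection in point 4: aggregate the raw rows once, then serve aggregate queries from the materialized result instead of rescanning the source on every request. The data and function names are invented for illustration; Dremio's real reflections are built and selected automatically by its query planner.

```python
from collections import defaultdict

# Raw "source" rows, as a query engine might scan them on every request.
RAW_SALES = [
    {"region": "EU", "amount": 120},
    {"region": "EU", "amount": 80},
    {"region": "US", "amount": 200},
    {"region": "US", "amount": 50},
    {"region": "APAC", "amount": 70},
]

def build_reflection(rows):
    """Materialize a pre-aggregated view: total amount per region."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

def total_for_region(reflection, region):
    """Answer an aggregate query from the reflection instead of
    rescanning the raw rows."""
    return reflection.get(region, 0)

reflection = build_reflection(RAW_SALES)   # built once, reused by many queries
print(total_for_region(reflection, "EU"))  # 200
```

The point of the pattern is the asymmetry: the expensive scan happens once at reflection-build time, while each subsequent query touches only the small materialized result.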

In summary, Apache Drill and Dremio are both powerful data exploration and analysis tools but differ in their approach to data virtualization, architecture, enterprise-grade features, the concept of data reflections, user experience, and community/support offerings.
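
The unified-query idea both tools share can be illustrated with a small Python sketch: two heterogeneous sources (a CSV extract and a JSON document) are exposed to a single join, the way a federated engine presents disparate stores behind one SQL interface. All data and helper names here are invented; real engines plan and push down such joins far more cleverly.

```python
import csv
import io
import json

# Two heterogeneous "sources": a CSV file and a JSON document.
CSV_ORDERS = "order_id,customer_id,total\n1,42,9.99\n2,7,24.50\n"
JSON_CUSTOMERS = '[{"customer_id": 42, "name": "Ada"}, {"customer_id": 7, "name": "Grace"}]'

def rows_from_csv(text):
    """Present a CSV source as a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def rows_from_json(text):
    """Present a JSON source in the same row-dict shape."""
    return json.loads(text)

def join(left, right, key):
    """Hash-join two row sets on a shared key, as a federated engine would."""
    index = {str(r[key]): r for r in right}
    return [{**l, **index[str(l[key])]} for l in left if str(l[key]) in index]

orders = rows_from_csv(CSV_ORDERS)
customers = rows_from_json(JSON_CUSTOMERS)
result = join(orders, customers, "customer_id")
print(result[0]["name"])  # Ada
```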

Advice on Dremio and Apache Drill

We need to perform ETL from several databases into a data warehouse or data lake. We want to

  • keep raw and transformed data available to users to draft their own queries efficiently
  • give users the ability to give custom permissions and SSO
  • move between open-source on-premises development and cloud-based production environments

We want to use only inexpensive Amazon EC2 instances, on medium-sized data sets (16 GB to 32 GB), feeding into Tableau Server or Power BI for reporting and data analysis purposes.

Replies (3)
John Nguyen
Airflow · AWS Lambda

You could also use AWS Lambda with a CloudWatch Events schedule if you know when the function should be triggered. The benefit is that you can use any language along with the respective database client.
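
As a sketch of that pattern, here is a minimal Python Lambda handler that a CloudWatch Events schedule could invoke; `fetch_rows` stands in for a real database read and is purely hypothetical, as is the trivial transform.

```python
import json

def fetch_rows():
    """Stand-in for a real database read (via the respective DB client);
    hypothetical data for this sketch."""
    return [{"id": 1, "value": "  a  "}, {"id": 2, "value": "b"}]

def transform(rows):
    """Trivial cleaning step: trim whitespace in string fields."""
    return [{k: v.strip() if isinstance(v, str) else v for k, v in row.items()}
            for row in rows]

def handler(event, context):
    """Entry point a scheduled CloudWatch Events rule would invoke."""
    rows = transform(fetch_rows())
    # A real function would write `rows` to the warehouse or lake here.
    return {"statusCode": 200, "body": json.dumps({"rows_processed": len(rows)})}
```

Locally you can exercise it with `handler({}, None)`; on AWS, the schedule expression (e.g. `rate(1 hour)`) decides when it runs.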

But if you need to orchestrate several ETL jobs, it makes sense to use Apache Airflow. This requires Python knowledge.


Though we have always built something custom, Apache Airflow stood out as a key contender/alternative among open-source options. On the commercial side, Amazon Redshift combined with Amazon Kinesis (for complex manipulations) is great for BI, though Redshift as such is expensive.


You may want to look into a data virtualization product called Conduit. It connects to disparate data sources in AWS, on-prem, Azure, and GCP, and exposes them as a single unified Spark SQL view to Power BI (direct query) or Tableau. It allows auto-query and caching policies to improve query speed and experience, has a GPU query engine with optimized Spark as a fallback, and can be deployed on your AWS VM or on-prem, scaling up and out. It sounds like an ideal solution for your needs.

karunakaran karthikeyan
Needs advice

I am trying to build a data lake by pulling data from multiple data sources (custom-built tools, Excel files, CSV files, etc.) and use the data lake to generate dashboards.

My question is which is the best tool to do the following:

  1. Create pipelines to ingest the data from multiple sources into the data lake
  2. Help me in aggregating and filtering data available in the data lake.
  3. Create new reports by combining different data elements from the data lake.

I need to use only open-source tools for this activity.

I appreciate your valuable inputs and suggestions. Thanks in advance.
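
A minimal, standard-library Python sketch of the three steps above (ingest, aggregate, report); all table and column names are invented for illustration, and a real pipeline would of course use proper ingestion and dashboarding tools.

```python
import csv
import io

LAKE = {}  # table name -> list of row dicts

def ingest_csv(table, text):
    """Step 1: land a CSV extract in the lake as-is."""
    LAKE[table] = list(csv.DictReader(io.StringIO(text)))

def aggregate(table, group_key, value_key):
    """Step 2: group rows and sum a numeric column."""
    out = {}
    for row in LAKE[table]:
        out[row[group_key]] = out.get(row[group_key], 0) + float(row[value_key])
    return out

# Step 3: the aggregated dict is the report-ready data set.
ingest_csv("visits", "site,count\nA,3\nB,5\nA,2\n")
report = aggregate("visits", "site", "count")
print(report)  # {'A': 5.0, 'B': 5.0}
```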

Replies (1)
Rod Beecham
Partnering Lead at Zetaris

Hi Karunakaran. I obviously have an interest here, as I work for the company, but the problem you are describing is one that Zetaris can solve. Talend is a good ETL product, and Dremio is a good data virtualization product, but the problem you are describing best fits a tool that can combine the five styles of data integration (bulk/batch data movement, data replication/data synchronization, message-oriented movement of data, data virtualization, and stream data integration). I may be wrong, but Zetaris is, to the best of my knowledge, the only product in the world that can do this. Zetaris is not a dashboarding tool - you would need to combine us with Tableau or Qlik or PowerBI (or whatever) - but Zetaris can consolidate data from any source and any location (structured, unstructured, on-prem or in the cloud) in real time to allow clients a consolidated view of whatever they want whenever they want it. Please take a look at for more information. I don't want to do a "hard sell", here, so I'll say no more! Warmest regards, Rod Beecham.

Pros of Dremio
  • Nice GUI to enable more people to work with data (3 upvotes)
  • Connects NoSQL databases with RDBMS (2 upvotes)
  • Easier to deploy (2 upvotes)

Pros of Apache Drill
  • NoSQL and Hadoop (4 upvotes)
  • Lightning speed and simplicity in the face of a data jungle (3 upvotes)
  • Well documented for fast install (2 upvotes)
  • SQL interface to multiple data sources (1 upvote)
  • Nested data support (1 upvote)
  • Reads structured and unstructured data (1 upvote)
  • V1.10 released (1 upvote)


Cons of Dremio
  • Works only on Iceberg structured data (1 upvote)

Cons of Apache Drill
  • None listed yet


What is Dremio?

Dremio, the data lake engine, operationalizes your data lake storage and speeds your analytics processes with a high-performance and high-efficiency query engine, while also democratizing data access for data scientists and analysts.

What is Apache Drill?

Apache Drill is a distributed MPP query layer that supports SQL and alternative query languages against NoSQL and Hadoop data storage systems. It was inspired in part by Google's Dremel.


What are some alternatives to Dremio and Apache Drill?
  • Presto: Distributed SQL query engine for big data.
  • Denodo: It is the leader in data virtualization, providing data access, data governance and data delivery capabilities across the broadest range of enterprise, cloud, big data, and unstructured data sources without moving the data from their original repositories.
  • AtScale: Its Virtual Data Warehouse delivers performance, security and agility to exceed the demands of modern-day operational analytics.
  • Snowflake: Snowflake eliminates the administration and management demands of traditional data warehouses and big data platforms. Snowflake is a true data warehouse as a service running on Amazon Web Services (AWS), with no infrastructure to manage and no knobs to turn.
  • Segment: Segment is a single hub for customer data. Collect your data in one place, then send it to more than 100 third-party tools, internal systems, or Amazon Redshift with the flip of a switch.