
<p align="center">
  <img src="docs/assets/img/logo_large.png"/>
</p>
<p align="center">
	<a href="https://raw.githubusercontent.com/treeverse/lakeFS/master/LICENSE">
		<img src="https://img.shields.io/badge/License-Apache%202.0-blue.svg" alt="Apache License" /></a>
	<a href="https://github.com/treeverse/lakeFS/actions/workflows/test.yaml?query=branch%3Amaster">
		<img src="https://github.com/treeverse/lakeFS/workflows/Test/badge.svg?branch=master" alt="Go tests status" /></a>
	<a href="https://github.com/treeverse/lakeFS/actions/workflows/node.yaml?query=branch%3Amaster">
		<img src="https://github.com/treeverse/lakeFS/workflows/Node/badge.svg?branch=master" alt="Node tests status" /></a>
	<a href="https://github.com/treeverse/lakeFS/actions/workflows/esti.yaml?query=branch%3Amaster">
		<img src="https://github.com/treeverse/lakeFS/workflows/Esti/badge.svg?branch=master" alt="Integration tests status" /></a>
	<a href="https://github.com/treeverse/lakeFS/actions/workflows/docs-pr.yaml">
		<img src="https://github.com/treeverse/lakeFS/actions/workflows/docs-pr.yaml/badge.svg" alt="Docs Preview & Link Check status" /></a>
	<a href="https://artifacthub.io/packages/search?repo=lakefs">
		<img src="https://img.shields.io/endpoint?url=https://artifacthub.io/badge/repository/lakefs" alt="Artifact HUB" /></a>
	<a href="CODE_OF_CONDUCT.md">
		<img src="https://img.shields.io/badge/Contributor%20Covenant-v2.0%20adopted-ff69b4.svg" alt="code of conduct"></a>
</p>

## lakeFS is Data Version Control (Git for Data)

lakeFS is an open-source tool that transforms your object storage into a Git-like repository. It enables you to manage your data lake the way you manage your code.

With lakeFS you can build repeatable, atomic, and versioned data lake operations, from complex ETL jobs to data science and analytics.

lakeFS supports AWS S3, Azure Blob Storage, and Google Cloud Storage as its underlying storage service. It is API compatible with S3 and works seamlessly with popular data frameworks such as Spark, Hive, AWS Athena, DuckDB, and Presto.

For more information, see the [documentation](https://docs.lakefs.io).

## Getting Started

You can spin up a standalone sandbox instance of lakeFS using Docker:

```bash
docker run --pull always \
           --name lakefs \
           -p 8000:8000 \
           treeverse/lakefs:latest \
           run --quickstart
```

Once you've got lakeFS running, open [http://127.0.0.1:8000/](http://127.0.0.1:8000/) in your web browser.
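
Because lakeFS is API compatible with S3, existing S3 clients and tools can talk to it directly. As a sketch (the repository name `quickstart` and the credentials below are hypothetical placeholders; use the values your lakeFS instance prints during setup), you can list a branch's objects with the AWS CLI:

```bash
# Placeholders: substitute the access key pair generated by your lakeFS instance.
export AWS_ACCESS_KEY_ID="<your-lakefs-access-key-id>"
export AWS_SECRET_ACCESS_KEY="<your-lakefs-secret-access-key>"

# lakeFS paths through the S3 gateway take the form s3://<repository>/<branch>/<object path>.
aws s3 ls s3://quickstart/main/ --endpoint-url http://127.0.0.1:8000
```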

### Quickstart

**👉🏻 For a hands-on walkthrough of the core functionality in lakeFS, head over to [the quickstart](https://docs.lakefs.io/quickstart/) to jump right in!**

Make sure to also have a look at the [lakeFS samples](https://github.com/treeverse/lakeFS-samples), a rich collection of end-to-end example applications that you can build with lakeFS.

## Why Do I Need lakeFS?

### ETL Testing with Isolated Dev/Test Environment

When working with a data lake, it's useful to have replicas of your production environment. These replicas allow you to test ETL jobs and understand changes to your data without impacting downstream data consumers.

Running ETL and transformation jobs directly in production without proper testing is a guaranteed way to let data issues flow into dashboards, ML models, and other consumers sooner or later. The most common way to avoid making changes directly in production is to create and maintain multiple data environments and perform ETL testing on them: a dev environment in which to develop the data pipelines, and a test environment in which pipeline changes are validated before being pushed to production. With lakeFS you can create a branch and get a copy of the full production data without copying anything, making ETL testing faster and easier.
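
A sketch of this branch-based workflow using the `lakectl` CLI (the repository and branch names are hypothetical, and exact flags may differ between lakectl versions, so check `lakectl --help`):

```bash
# Create an isolated branch from production data; no objects are copied.
lakectl branch create lakefs://example-repo/etl-test \
    --source lakefs://example-repo/main

# ... run and validate your ETL job against lakefs://example-repo/etl-test ...

# If the results look good, commit and merge them back into production.
lakectl commit lakefs://example-repo/etl-test -m "tested ETL output"
lakectl merge lakefs://example-repo/etl-test lakefs://example-repo/main
```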

### Reproducibility

Data changes frequently. This makes the task of keeping track of its exact state over time difficult. Oftentimes, people maintain only one state of their data: its current state.

This has a negative impact on the work, as it becomes hard to:
* Debug a data issue.
* Validate machine learning training accuracy (re-running a model over different data gives different results).
* Comply with data audits.

In comparison, lakeFS exposes a Git-like interface to data that keeps track of more than just the current state. This makes reproducing the state of your data at any point in time straightforward.
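
For example, because every lakeFS commit is immutable, an analysis can be pinned to a commit ID instead of a branch name. A sketch (repository and path names are hypothetical; verify the commands against `lakectl --help`):

```bash
# Find the commit that produced the data you analyzed.
lakectl log lakefs://example-repo/main --amount 1

# Read through the S3 gateway using the commit ID instead of a branch name,
# so the same data is returned no matter how main evolves afterwards.
aws s3 ls s3://example-repo/<commit-id>/datasets/ \
    --endpoint-url http://127.0.0.1:8000
```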

### CI/CD for Data

Data pipelines feed processed data from data lakes to downstream consumers like business dashboards and machine learning models. As more and more organizations rely on data for business-critical decisions, data reliability and trust are of paramount concern, so it's important to ensure that production data adheres to an organization's data governance policies. These requirements can be as simple as a file format validation or a schema check, or as exhaustive as removing all PII (Personally Identifiable Information) from an organization's data.

To ensure quality and reliability at each stage of the data lifecycle, data quality gates need to be implemented: run Continuous Integration (CI) tests on the data, and only if the data governance requirements are met can the data be promoted to production for business use.

Every time production data is updated, the best practice is to run CI tests and only then promote (deploy) the data to production. With lakeFS you can create hooks that make sure only data that passed these tests becomes part of production.
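
In lakeFS, such quality gates are expressed as actions: YAML files committed under the repository's `_lakefs_actions/` path that run hooks on events such as pre-commit and pre-merge. A sketch of one (the file name, action name, hook id, and webhook URL are hypothetical; see the lakeFS hooks documentation for the full schema):

```yaml
# _lakefs_actions/pre-merge-format-check.yaml (hypothetical example)
name: pre merge format check
on:
  pre-merge:
    branches:
      - main
hooks:
  - id: format_validator
    type: webhook
    properties:
      # The merge is blocked unless this endpoint returns success.
      url: http://<your-validation-service>/webhooks/format
```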

### Rollback

A rollback operation is used to fix critical data errors immediately.

What is a critical data error? Think of a situation where erroneous or misformatted data causes a significant issue with an important service or function. In such situations, the first thing to do is stop the bleeding.

Rolling back returns data to a state in the past, before the error was present. You might not be showing all the latest data after a rollback, but at least you aren't showing incorrect data or raising errors. Since lakeFS provides versions of the data without making copies, you can time travel between versions and roll back to the version that predates the error.
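
A rollback in lakeFS is an operation on branch metadata rather than on the objects themselves, so it takes effect immediately regardless of data size. A sketch using `lakectl` (the names and commit reference are hypothetical placeholders; verify the exact subcommand against `lakectl --help`):

```bash
# Undo the changes introduced by the bad commit on main,
# creating a new commit that restores the previous state.
lakectl branch revert lakefs://example-repo/main <bad-commit-id>
```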

## Community

Stay up to date and get lakeFS support via:

- Share your lakeFS experience and get support on [our Slack](https://go.lakefs.io/JoinSlack).
- Follow us and join the conversation on [Twitter](https://twitter.com/lakeFS) and [Mastodon](https://data-folks.masto.host/@lakeFS).
- Learn from video tutorials on [our YouTube channel](https://lakefs.io/youtube).
- Read more on data versioning and other data lake best practices in [our blog](https://lakefs.io/blog/data-version-control/).
- Feel free to [contact us](https://lakefs.io/contact-us/) about anything else.

## More information

- Read the [documentation](https://docs.lakefs.io).
- See the [contributing guide](https://docs.lakefs.io/contributing).
- Take a look at our [roadmap](https://docs.lakefs.io/understand/roadmap.html) to peek into the future of lakeFS.

## Licensing

lakeFS is completely free and open-source, licensed under the [Apache 2.0 License](https://www.apache.org/licenses/LICENSE-2.0).

## Who Uses lakeFS?

lakeFS is used by numerous companies, including those below. _If you use lakeFS and would like to be included here please open a PR._

* AirAsia
* APEX Global
* AppsFlyer
* Auburn University
* BAE Systems
* Bureau of Labor Statistics
* Cambridge Consultants
* Connor, Clark & Lunn Financial Group
* Context Labs Bv
* Daimler Truck
* Enigma
* EPCOR
* Ford Motor Company
* Generali
* Giesecke+Devrient
* greehill
* Karius
* Luxonis
* Mixpeek
* Netflix
* Paige
* PETRONAS
* Pollinate
* Proton Technologies AG
* ProtonMail
* Renaissance Computing Institute
* RHEA Group
* RMS
* Sensum
* Similarweb
* State Street Global Advisors
* Terramera
* Tredence
* Volvo Cars
* Webiks
* Windward
* Woven by Toyota