[MINOR] fix(dos): Fix minor error about spelling and web link (#6290)
### What changes were proposed in this pull request?

Change `Hadop` -> `Hadoop` and fix a broken link: `hadoop-catalog-with-S3.md` -> `hadoop-catalog-with-s3.md`.

### Why are the changes needed?

The docs contain a spelling mistake (`Hadop`) and a mis-cased link that points to a non-existent file.

### Does this PR introduce _any_ user-facing change?

N/A

### How was this patch tested?

N/A
yuqi1129 authored Jan 16, 2025
1 parent 24952c9 commit 393609b
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion docs/hadoop-catalog-with-s3.md
@@ -446,7 +446,7 @@ In order to access fileset with S3 using the GVFS Python client, apart from [bas
| `s3_secret_access_key` | The secret key of the AWS S3. | (none) | Yes | 0.7.0-incubating |

:::note
-- `s3_endpoint` is an optional configuration for GVFS **Python** client but a required configuration for GVFS **Java** client to access Hadop with AWS S3, and it is required for other S3-compatible storage services like MinIO.
+- `s3_endpoint` is an optional configuration for GVFS **Python** client but a required configuration for GVFS **Java** client to access Hadoop with AWS S3, and it is required for other S3-compatible storage services like MinIO.
- If the catalog has enabled [credential vending](security/credential-vending.md), the properties above can be omitted.
:::
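
For illustration, here is a minimal sketch (not part of this diff) of passing the options described in the note above to the GVFS Python client. It assumes the `gvfs.GravitinoVirtualFileSystem` entry point from Gravitino's GVFS docs; the option name `s3_access_key_id`, the endpoint, credentials, and the fileset path are placeholders:

```python
from gravitino import gvfs

# Placeholder values. Per the note above, s3_endpoint is optional for the
# Python client against AWS S3 but required for S3-compatible services
# such as MinIO (a MinIO-style endpoint is assumed here).
options = {
    "s3_endpoint": "http://localhost:9000",
    "s3_access_key_id": "YOUR_ACCESS_KEY",      # assumed option name
    "s3_secret_access_key": "YOUR_SECRET_KEY",
}

fs = gvfs.GravitinoVirtualFileSystem(
    server_uri="http://localhost:8090",   # placeholder Gravitino server
    metalake_name="example_metalake",     # placeholder metalake
    options=options,
)

# List files in a hypothetical S3-backed fileset.
print(fs.ls("gvfs://fileset/s3_catalog/example_schema/example_fileset/"))
```

If the catalog has credential vending enabled, the S3 keys above can be omitted, as the note says.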

2 changes: 1 addition & 1 deletion docs/hadoop-catalog.md
@@ -9,7 +9,7 @@ license: "This software is licensed under the Apache License version 2."
## Introduction

Hadoop catalog is a fileset catalog that using Hadoop Compatible File System (HCFS) to manage
-the storage location of the fileset. Currently, it supports the local filesystem and HDFS. Since 0.7.0-incubating, Gravitino supports [S3](hadoop-catalog-with-S3.md), [GCS](hadoop-catalog-with-gcs.md), [OSS](hadoop-catalog-with-oss.md) and [Azure Blob Storage](hadoop-catalog-with-adls.md) through Hadoop catalog.
+the storage location of the fileset. Currently, it supports the local filesystem and HDFS. Since 0.7.0-incubating, Gravitino supports [S3](hadoop-catalog-with-s3.md), [GCS](hadoop-catalog-with-gcs.md), [OSS](hadoop-catalog-with-oss.md) and [Azure Blob Storage](hadoop-catalog-with-adls.md) through Hadoop catalog.

The rest of this document will use HDFS or local file as an example to illustrate how to use the Hadoop catalog. For S3, GCS, OSS and Azure Blob Storage, the configuration is similar to HDFS, please refer to the corresponding document for more details.
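
As context (not part of this diff), a minimal sketch of creating such a Hadoop catalog with the Gravitino Python client, assuming the `GravitinoClient.create_catalog` API shape shown in Gravitino's fileset documentation; the server URI, metalake name, catalog name, and storage location are placeholders:

```python
from gravitino import Catalog, GravitinoClient

# Placeholder server URI and metalake name (the metalake is assumed to exist).
client = GravitinoClient(uri="http://localhost:8090",
                         metalake_name="example_metalake")

# A Hadoop catalog manages fileset storage locations through HCFS; the
# location here is a local path, but an hdfs:// URI works the same way.
catalog = client.create_catalog(
    name="example_hadoop_catalog",
    catalog_type=Catalog.Type.FILESET,
    provider="hadoop",
    comment="Fileset catalog backed by HCFS",
    properties={"location": "file:///tmp/example_filesets"},
)
```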

