SB5.1 - OCI - Deploy/KnowledgePlatform/FlinkJobs - S3 storage #327
-
CSP: OCI. Could you please share the steps to configure Deploy/KnowledgePlatform/FlinkJobs to use S3 as the storage option? I tried to use an S3-compatible API, but the jobmanager is failing to start.
-
Any update on this?
-
Any update on this?
-
@AmiableAnil @reshmi-nair Can you please provide your inputs on this thread?
-
I built the cloud storage SDK for using OCI Object Storage; the key things changed are in https://gist.github.com/ddevadat/7a3938833b2bf2908c6b0773bca04e93. I then tried to start the KP Flink jobs: questionset-publish is working fine and creates its checkpoint directories in the OCI Object Storage bucket. However, the other jobmanagers are failing with the error below:

```
java.nio.file.AccessDeniedException: flink-check-points-store: org.apache.hadoop.fs.s3a.auth.NoAuthWithAWSException: No AWS Credentials provided by SimpleAWSCredentialsProvider EnvironmentVariableCredentialsProvider InstanceProfileCredentialsProvider : com.amazonaws.SdkClientException: The requested metadata is not found at http://169.254.169.254/latest/meta-data/iam/security-credentials/
```
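One way to get past this particular error, assuming the jobs use the Hadoop S3A client as above, is to hand the access keys to the pods through the standard AWS environment variables that EnvironmentVariableCredentialsProvider (listed in the stack trace) reads, so the client never falls back to the EC2 instance-metadata endpoint, which does not exist on OCI. A minimal sketch for the jobmanager/taskmanager containers in the Flink helm chart; the secret name and keys are placeholders, not values from this deployment:

```yaml
# Hypothetical env entries for the Flink jobmanager/taskmanager pod template.
# EnvironmentVariableCredentialsProvider reads these two variables, so the S3A
# client no longer queries http://169.254.169.254 for instance credentials.
env:
  - name: AWS_ACCESS_KEY_ID
    valueFrom:
      secretKeyRef:
        name: oci-s3-credentials      # placeholder secret name
        key: access-key
  - name: AWS_SECRET_ACCESS_KEY
    valueFrom:
      secretKeyRef:
        name: oci-s3-credentials
        key: secret-key
```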
-
1) when using hadoop-aws
2) After this it is able to create the checkpoint buckets, but it is not able to interact with the bucket, because the client checks whether the bucket exists. For this we need to use Hadoop 3.3.1 or higher, because Hadoop introduced a feature to disable the bucket verification check (see the sketch after this list). So now we know that the minimum Hadoop version required is 3.3.1. But on using 3.3.1, we get further errors.
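For reference (not taken from the thread itself): the feature referred to in 2) is the Hadoop S3A bucket probe, which can be turned off so the client skips the bucket-existence check. A minimal sketch of the property, assuming your deployment renders Hadoop S3A options somewhere the Flink jobs pick them up:

```yaml
# Hadoop 3.3.x S3A option that disables the bucket-existence probe mentioned above.
# Where it is set (core-site.xml, flink-conf.yaml, or the helm chart's base
# config template) depends on how your deployment wires Hadoop configuration.
fs.s3a.bucket.probe: 0
```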
-
Eventually solved this by using flink-s3-fs-presto. For this, I added the dependency while building the cloud-storage-sdk package. Then, in the Flink helm chart deployment, I added the presto variables for passing the S3 credentials, like below.
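For anyone looking for the shape of that configuration: the presto S3 filesystem takes its credentials from flink-conf.yaml, so the helm chart values end up rendering entries like the sketch below. This is a hedged reconstruction rather than the exact values from this deployment; the dependency is the org.apache.flink:flink-s3-fs-presto artifact, and the endpoint shown is a placeholder for OCI's S3-compatibility endpoint.

```yaml
# Sketch of flink-conf.yaml entries for flink-s3-fs-presto against an
# S3-compatible object store; all values are placeholders.
s3.access-key: "<customer-secret-key-access-key>"
s3.secret-key: "<customer-secret-key-secret>"
s3.endpoint: "https://<namespace>.compat.objectstorage.<region>.oraclecloud.com"
s3.path.style.access: "true"   # drop if your object store supports virtual-host addressing
```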