
Permission denied in a Minikube Docker script

2022-10-07

I am new to Minikube and Docker. I have a Minikube setup running three Apache Spark pods: one Spark master and two Spark workers. The Dockerfile for my Spark master is as follows:

# base image
FROM openjdk:11

# define spark and hadoop versions
ENV SPARK_VERSION=3.2.0
ENV HADOOP_VERSION=3.3.1

# download and install hadoop
RUN mkdir -p /opt && \
    cd /opt && \
    curl http://archive.apache.org/dist/hadoop/common/hadoop-${HADOOP_VERSION}/hadoop-${HADOOP_VERSION}.tar.gz | \
        tar -zx hadoop-${HADOOP_VERSION}/lib/native && \
    ln -s hadoop-${HADOOP_VERSION} hadoop && \
    echo Hadoop ${HADOOP_VERSION} native libraries installed in /opt/hadoop/lib/native

# download and install spark
RUN mkdir -p /opt && \
    cd /opt && \
    curl http://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop2.7.tgz | \
        tar -zx && \
    ln -s spark-${SPARK_VERSION}-bin-hadoop2.7 spark && \
    echo Spark ${SPARK_VERSION} installed in /opt

# add scripts and update spark default config
ADD common.sh spark-master spark-worker /
ADD spark-defaults.conf /opt/spark/conf/spark-defaults.conf
ENV PATH $PATH:/opt/spark/bin

When I deploy the pods, I get the following error:

Events:
  Type     Reason     Age                   From               Message
  ----     ------     ----                  ----               -------
  Warning  Failed     25m (x5 over 26m)     kubelet            Error: failed to start container "spark-master": Error response from daemon: failed to create shim task: OCI runtime create failed: runc create failed: unable to start container process: exec: "/spark-master": permission denied: unknown
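
A quick way to confirm what the runtime is complaining about is to list the file mode of the script inside the built image (spark-master:latest is a hypothetical tag; substitute whatever tag you build with):

docker build -t spark-master:latest .
# A mode like -rw-r--r-- (no "x" bits) reproduces the permission denied error
docker run --rm spark-master:latest ls -l /spark-master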

The contents of the spark-master script:

#!/bin/bash

. /common.sh

echo "$(hostname -i) spark-master" >> /etc/hosts

/opt/spark/bin/spark-class org.apache.spark.deploy.master.Master --ip spark-master --port 7077 --webui-port 8080

Please help me resolve this issue. My Docker version is: Docker version 20.10.18, build b40c2f6

2 Answers

In the Dockerfile, I commented out the following line

#ADD common.sh spark-master spark-worker /

and replaced it with the following lines, which resolved the permission error:

COPY common.sh spark-master spark-worker  /
RUN chmod +x /common.sh /spark-master /spark-worker
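
ADD and COPY both preserve the permission bits of the source files, so scripts that are not executable in the build context will not be executable in the image either; the RUN chmod +x fixes that at build time. On Docker 20.10 with BuildKit enabled (DOCKER_BUILDKIT=1), a minimal sketch of the same fix as a single instruction:

# Set the mode at copy time instead of in a separate RUN layer
COPY --chmod=0755 common.sh spark-master spark-worker /

Alternatively, running chmod +x on the scripts in the build context before docker build achieves the same result, since COPY carries the source mode into the image.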
muhammad800804
2022-10-07

The contents of the spark-master script are as follows:

#!/bin/bash

. /common.sh

echo "$(hostname -i) spark-master" >> /etc/hosts

/opt/spark/bin/spark-class org.apache.spark.deploy.master.Master --ip spark-master --port 7077 --webui-port 8080 >> /var/log/spark-master.log 2>&1
Rahul Sherkar
2022-10-07