scala-logging does not create a log file

I am trying to use scala-logging to log from a Scala / Spark project that runs on Linux and is launched on a Spark cluster via spark-submit. Logging to stdout works fine, but no log file is created at all.

The dependencies in the build.sbt file look like this:

libraryDependencies ++= Seq(
  "org.apache.spark"           %% "spark-core"               % sparkVersion % "provided",
  "org.apache.spark"           %% "spark-sql"                % sparkVersion % "provided",
  "com.google.guava"            % "guava"                    % "28.1-jre",
  "org.scala-lang.modules"     %% "scala-parser-combinators" % "1.1.2",
  "org.scalatest"              %% "scalatest"                % "3.1.0-RC3"  % "test",
  "com.ibm.icu"                 % "icu4j"                    % "64.2",
  "com.typesafe.scala-logging" %% "scala-logging"            % "3.9.2",
  "ch.qos.logback"              % "logback-classic"          % "1.2.3",
  "com.typesafe"                % "config"                   % "1.3.4",
  "mysql"                       % "mysql-connector-java"     % "8.0.17",
  "com.zaxxer"                  % "HikariCP"                 % "3.4.0"
)

The logback.xml file, located in the resources directory, looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>

    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>

    <appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>logs/myProjectName.log</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <!-- Daily rollover -->
            <fileNamePattern>logs/myProjectName.%d{yyyy-MM-dd}.log</fileNamePattern>

            <!-- Keep 10 days' worth of history -->
            <maxHistory>10</maxHistory>
        </rollingPolicy>

        <encoder>
            <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>

    <!-- Configure so that it outputs to both console and log file -->
    <root level="DEBUG">
        <appender-ref ref="FILE"/>
        <appender-ref ref="STDOUT"/>
    </root>
</configuration>
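For context: the FILE appender points at the relative path logs/myProjectName.log, which logback resolves against the working directory of the JVM that loads this configuration. A minimal, hedged way to see where that would land for a given process (the path literal simply mirrors the config above; nothing else is assumed):

import java.nio.file.Paths

// Prints where the relative appender path would resolve for the current JVM
println(Paths.get("logs/myProjectName.log").toAbsolutePath)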

This is how I create the logger in my Scala code:

import com.typesafe.scalalogging.Logger

val LOGGER: Logger = Logger[MyClassname]
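For completeness, a minimal sketch of how that logger is used inside the class; the class body and method here are placeholders, not the actual project code:

import com.typesafe.scalalogging.Logger

class MyClassname {
  // Logger[T] names the underlying SLF4J logger after the class via its ClassTag
  val LOGGER: Logger = Logger[MyClassname]

  def doWork(): Unit = {
    LOGGER.debug("starting work")   // DEBUG passes the root level, so both appenders should receive it
    LOGGER.info("work finished")
  }
}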

This is how I launch the job via spark-submit:

/opt/Spark/spark/bin/spark-submit --name MyProject --master 'spark://myHost.com:7077' --class com.mycompany.mypack.MyClassname --driver-memory 15G --num-executors 2 --executor-cores 6 --executor-memory 30G --conf spark.hadoop.mapreduce.fileoutputcommitter.algorithm.version=2 /opt/myFolder/myProject-assembly.jar

What am I missing?

Thanks
