Using configuration files in a Spark project

Problem description

In a Spark SQL project, the SQL scripts are placed under the resource/sql/ directory (there are many scripts, one per business case). Locally, the code loads a script with this.getClass.getResource().getPath, reads the file content at the returned path, and assembles it into a SQL string for execution.
Now the built jar needs to be submitted with spark-submit, and the file path can never be found (the log shows the path does not exist at all; yet when the packaged jar is unpacked, the configuration files are clearly inside it).
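A minimal sketch of why the in-jar lookup fails (shown in Java; the Scala call is identical, and /sql/query.sql is a hypothetical resource name, not one from the question):

```java
import java.net.URL;

public class ResourcePathDemo {
    public static void main(String[] args) {
        // In the IDE the resource is a plain file on disk, so getPath()
        // yields a usable filesystem path. Inside a packaged jar the URL
        // looks like jar:file:/path/to/app.jar!/sql/query.sql, and the
        // "!/..." part is not a path that java.io.File can open.
        URL url = ResourcePathDemo.class.getResource("/sql/query.sql");
        System.out.println(url == null ? "resource not on classpath" : url.getPath());
    }
}
```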

Environment background and what I have tried

What I have tried is to package the configuration files separately from the jar and specify their location on the Linux machine so they can be read from there. But this approach feels a bit clumsy. Has anyone else run into this problem? Any ideas would be appreciated. Thank you.


I don't have an environment at hand, so I can't test this.
My guess is that in the Spark environment, the path returned by this.getClass.getResource().getPath is being resolved against HDFS rather than the local filesystem.
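Whatever the exact cause, the usual fix is to stop converting the resource URL to a filesystem path and read the resource through a stream instead, which works both from the IDE and from inside the jar. A sketch in Java (the Scala equivalent uses the same calls; /sql/query.sql is a hypothetical resource path):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class SqlLoader {
    /**
     * Reads a classpath resource into a String. Unlike getResource().getPath(),
     * getResourceAsStream() never assumes the resource is a file on disk, so it
     * behaves the same whether the resource sits in target/classes or inside a jar.
     */
    static String readResource(String path) throws IOException {
        try (InputStream in = SqlLoader.class.getResourceAsStream(path)) {
            if (in == null) {
                throw new IOException("Resource not found on classpath: " + path);
            }
            return new String(in.readAllBytes(), StandardCharsets.UTF_8);
        }
    }

    public static void main(String[] args) {
        try {
            // Scripts placed under src/main/resources/sql/ end up at /sql/... on the classpath.
            System.out.println(readResource("/sql/query.sql"));
        } catch (IOException e) {
            System.err.println(e.getMessage());
        }
    }
}
```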
All the file reads under my classpath go through the com.typesafe.config library, and there are no problems either locally or on the cluster.
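For reference, a minimal sketch of that approach, assuming the com.typesafe:config dependency is on the classpath and a hypothetical key app.sql.dir is defined in application.conf:

```java
// Requires the com.typesafe:config dependency (not part of the JDK).
import com.typesafe.config.Config;
import com.typesafe.config.ConfigFactory;

public class AppConfig {
    public static void main(String[] args) {
        // ConfigFactory.load() reads application.conf (merged with any
        // reference.conf) from the classpath, so the lookup works the same
        // on disk and from inside a packaged jar.
        Config conf = ConfigFactory.load();
        String sqlDir = conf.getString("app.sql.dir"); // hypothetical key
        System.out.println(sqlDir);
    }
}
```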
