I have the log4j2.properties file available on the classpath via the container image, but it's not getting applied. The log level in that log4j2.properties is set to DEBUG, but I am never able to see debug logs for the serverless batch job (either in the Google Cloud console or via the command: gcloud dataproc batches wait batch-id).
Here are some common troubleshooting steps that you can take to diagnose and fix the issue:
Verify the File Location: Ensure that the log4j2.properties file is indeed in the classpath of your application. You can check this by logging the classpath at runtime and verifying that the path to your log4j2.properties file is included.
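A minimal sketch of such a check (assuming a Java entry point; the class name is illustrative, and you could equally call these two lines from your existing driver code):

    public class ClasspathCheck {
        public static void main(String[] args) {
            // Print the classpath the JVM is actually running with
            System.out.println("java.class.path = " + System.getProperty("java.class.path"));
            // Ask the classloader where (or whether) it finds the file; prints null if it is not visible
            System.out.println("log4j2.properties resolved to: "
                    + Thread.currentThread().getContextClassLoader().getResource("log4j2.properties"));
        }
    }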
Check the File Format: Make sure that the log4j2.properties file is correctly formatted and does not contain any syntax errors. You can validate it using online tools or by checking against the log4j2 documentation.
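For reference, a minimal well-formed log4j2.properties that sets the root level to DEBUG, as described in the question (appender name and pattern are illustrative):

    appender.console.type = Console
    appender.console.name = console
    appender.console.layout.type = PatternLayout
    appender.console.layout.pattern = %d{HH:mm:ss} %-5p %c{1} - %m%n
    rootLogger.level = debug
    rootLogger.appenderRef.stdout.ref = console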
Use the Correct Property Key: In some cases, you might need to use the property key log4j2.configurationFile instead of log4j.configurationFile. Try changing your configuration to:
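    -Dlog4j2.configurationFile=classpath:log4j2.properties

(log4j2.configurationFile is the key introduced in Log4j 2.10; the older log4j.configurationFile key is still honored for compatibility.)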
Check for Overriding Configurations: There might be other configurations that are overriding your log4j2 settings. Look for any other log4j2 configuration files or settings that might be conflicting with your desired configuration.
Enable Log4j2 Debugging: You can enable internal log4j2 debugging by adding the following to your Java options:
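    -Dlog4j2.debug=true

With this set, Log4j 2 prints its internal status messages to the console, including which configuration file it located and where it was loaded from.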
Everything is good: the name, format, and path are correct, and the file does exist in the image.
Upon debugging further (via -Dlog4j2.debug=true as suggested), I found the following:
The serverless batch job somehow has /etc/spark/conf on the classpath before the path where my log4j2.properties file resides (set via SPARK_EXTRA_CLASSPATH), so using -Dlog4j.configurationFile=classpath:log4j2.properties always picks up the log4j2 file from /etc/spark/conf.
To fix this, I had to specify an explicit path, like -Dlog4j.configurationFile=file:/path/conf/log4j2.properties, and that works as expected.
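For anyone landing here, a sketch of how the explicit path can be passed at submit time, assuming you set it through Spark's extraJavaOptions rather than baking it into the image (the jar, class, and path are illustrative):

    gcloud dataproc batches submit spark \
        --jars=my-job.jar \
        --class=com.example.MyJob \
        --properties=spark.driver.extraJavaOptions=-Dlog4j.configurationFile=file:/path/conf/log4j2.properties

If you also need debug logs from executors, add spark.executor.extraJavaOptions with the same -D flag.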