| File |
|---|
| manually_build.yml |
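
Judging only by its name and the `.yml` extension, `manually_build.yml` appears to be a manually triggered CI workflow (e.g. a GitHub Actions `workflow_dispatch` job). The sketch below is a hypothetical minimal example of such a workflow, not the actual contents of the file in this repository; the workflow name, the `image_tag` input, and the `docker build` step are all assumptions for illustration.

```yaml
# Hypothetical sketch of a manually triggered build workflow.
# The job name, the image_tag input, and the build command are assumptions,
# not taken from the real manually_build.yml.
name: Manually Build Images

on:
  workflow_dispatch:
    inputs:
      image_tag:
        description: 'Tag for the built image'
        required: true
        default: 'latest'

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      # Check out the repository so the Dockerfile is available
      - uses: actions/checkout@v3
      # Build the image with the tag supplied when the workflow is dispatched
      - name: Build image
        run: docker build -t bigdata-toolkit:${{ github.event.inputs.image_tag }} .
```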