Releases: ayuhito/modern-tar

v0.7.1

14 Nov 10:03
6af3441

Overview

A small bug fix for cancelling streams in the web decoder, plus clarified documentation related to draining streams 🏑

  • fix(web): centralize stream cancellation logic by @ayuhito in #98
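
For context, draining matters when iterating entries from the streaming web decoder: each entry's body must be consumed or cancelled before the next entry is emitted. Below is a minimal sketch of skipping entries safely, assuming a createTarDecoder()-style TransformStream whose entries expose a body ReadableStream (the names are illustrative; check the package docs for the exact API):

import { createTarDecoder } from 'modern-tar'; // assumed export name

const response = await fetch('https://example.com/archive.tar'); // placeholder URL
const entries = response.body!.pipeThrough(createTarDecoder());

for await (const entry of entries) {
  if (entry.header.name.endsWith('.json')) {
    // Fully consume the bodies you care about.
    const text = await new Response(entry.body).text();
    console.log(entry.header.name, text.length);
  } else {
    // Drain (cancel) the bodies you skip so the decoder can advance.
    await entry.body.cancel();
  }
}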

Full Changelog: v0.7.0...v0.7.1

v0.7.0

13 Nov 06:14
877d26c

Overview

⚡ This release makes modern-tar comparable to or faster than other existing libraries in the ecosystem, thanks to a new architecture and various low-level optimizations!

Breaking ❗

  • Removed streamTimeout option from UnpackOptions.

What's Changed

  • perf: use pull architecture and leaner streams by @ayuhito in #95
  • fix(web): decoder should not wait for last chunk by @ayuhito in #96

Full Changelog: v0.6.1...v0.7.0

v0.6.1

24 Oct 03:13
a20fa3c

Overview

Resolves a bug where unpacking and then repacking directories using the Web API helpers produced mismatched headers 🐛

  • fix(pack): accept parsed tar entry with data by @ayuhito in #93
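
For illustration, this is the kind of round trip that was affected. A minimal sketch, assuming the buffered web helpers unpackTar and packTar consume and produce the same parsed entry shape (the names and signatures here are assumptions; check the package docs):

import { packTar, unpackTar } from 'modern-tar'; // assumed web API exports

const res = await fetch('https://example.com/project.tar'); // placeholder URL
const archive = new Uint8Array(await res.arrayBuffer());

// Unpack into fully buffered entries; directories carry headers but no data.
const entries = await unpackTar(archive);

// Repack the same entries; directory headers now round-trip without mismatches.
const repacked = new Uint8Array(await new Response(packTar(entries)).arrayBuffer());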

Full Changelog: v0.6.0...v0.6.1

v0.6.0

23 Oct 06:39
7dbb7e4

Overview

This is a breaking change for the Web API packTar function: the returned data property can now be either undefined or a Uint8Array.

This is needed so we can differentiate bodyless entries such as directories, which have no data, from empty files, which instead need to be represented as an empty Uint8Array(0) 🪴

/**
 * Represents an extracted entry with fully buffered content.
 *
 * For bodyless entries (directories, symlinks, hardlinks), `data` will be `undefined`.
 * For files (including empty files), `data` will be a `Uint8Array`.
 */
export interface ParsedTarEntryWithData {
	header: TarHeader;
-	data: Uint8Array;
+	data?: Uint8Array;
}
  • fix(web): do not return data on bodyless entries for pack by @ayuhito in #92
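
A minimal sketch of consuming entries under the new shape, checking data before use (unpackTar here stands in for whichever web helper produced the ParsedTarEntryWithData values and is an assumed name):

import { unpackTar } from 'modern-tar'; // assumed buffered web helper

const res = await fetch('https://example.com/project.tar'); // placeholder URL
const entries = await unpackTar(new Uint8Array(await res.arrayBuffer()));

for (const entry of entries) {
  if (entry.data === undefined) {
    // Bodyless entry: directory, symlink, or hardlink.
    console.log('skipping', entry.header.name);
    continue;
  }

  // Files always have data, even empty ones (a zero-length Uint8Array).
  console.log(entry.header.name, entry.data.byteLength, 'bytes');
}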

Full Changelog: v0.5.5...v0.6.0

v0.5.5

22 Oct 06:21
a3468f7

Overview

Fixes an edge case where the map helper could leave entries with empty paths when unpacking tar files 🐍

  • fix(options): handle invalid entries after map by @ayuhito in #90
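
For reference, this is the kind of usage involved: a minimal sketch assuming UnpackOptions exposes a tar-fs-style map callback that rewrites each header before extraction (the callback signature, import location, and the rewrite itself are illustrative):

import type { UnpackOptions } from 'modern-tar'; // assumed type export location

const options: UnpackOptions = {
  // Rewrite entry paths; entries whose mapped path becomes empty are now
  // handled gracefully instead of producing invalid writes.
  map: (header) => ({
    ...header,
    name: header.name.replace(/^package\//, ''), // hypothetical prefix strip
  }),
};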

Full Changelog: v0.5.4...v0.5.5

v0.5.4

16 Oct 01:58
9c4d224

Overview

This adds support for older tsconfig.json setups that may target newer Node versions but are still configured to use the old node10 module resolution mechanism.

  • fix(exports): use types versions for old node bundlers by @ayuhito in #88

Full Changelog: v0.5.3...v0.5.4

v0.5.3

14 Oct 16:14
c9d6302

Overview

To match existing tar implementations in Node, absolute paths are now stripped gracefully rather than throwing in every case ✂️

  • fix(unpack): strip absolute paths instead of throwing by @ayuhito in #86

Full Changelog: v0.5.2...v0.5.3

v0.5.2

14 Oct 03:00
07d6423

Overview

Includes performance improvements for packing, assorted bug fixes, and hardened unpacking on Windows environments ☘️

  • fix(pack): apply invalid directory mode on stream source by @ayuhito in #80
  • perf(pack): use 512KB read buffer on large files by @ayuhito in #81
  • fix(unpack): sanitize windows paths for traversal by @ayuhito in #82
  • perf(path): simplify unicode cache by @ayuhito in #83
  • fix(unpack): handle self referential hardlinks by @ayuhito in #84
  • fix(unpack): strip trailing slashes by @ayuhito in #85

Full Changelog: v0.5.1...v0.5.2

v0.5.1

12 Oct 15:01
cf6182c

Overview

packTar now supports overriding any metadata when using sources 🧷

import { packTar, type TarSource } from 'modern-tar/fs';

const sources: TarSource[] = [
  {
    type: "file",
    source: "./package.json",
    target: "project/package.json",
    mtime: new Date("2024-01-01T00:00:00Z"),
    uid: 1001,
    gid: 1002,
    uname: "builder",
    gname: "ci",
    mode: 0o644
  }
];

const tarStream = packTar(sources);
  • feat(pack): allow overriding more metadata on tar sources by @ayuhito in #78

Full Changelog: v0.5.0...v0.5.1

v0.5.0

12 Oct 09:04
2ab8c5e

Overview

This adds a new StreamSource type for packTar. It accepts either a Readable or a ReadableStream 🗃️

import { packTar, type TarSource } from 'modern-tar/fs';
import { createReadStream, createWriteStream } from 'node:fs';
import { pipeline } from 'node:stream/promises';

// Pack multiple sources
const sources: TarSource[] = [
  { type: 'file', source: './package.json', target: 'project/package.json' },
  { type: 'directory', source: './src', target: 'project/src' },
  { type: 'content', content: 'Hello World!', target: 'project/hello.txt' },
  { type: 'content', content: '#!/bin/bash\necho "Executable"', target: 'bin/script.sh', mode: 0o755 },
  { type: 'stream', content: createReadStream('./large-file.bin'), target: 'project/data.bin', size: 1048576 },
  { type: 'stream', content: (await fetch('/api/data')).body!, target: 'project/remote.json', size: 2048 }
];

const archiveStream = packTar(sources);
await pipeline(archiveStream, createWriteStream('project.tar'));

Note this is a breaking change for ContentSource (type: 'content'), as we used to accept ReadableStream there. This has been moved to StreamSource, since it is inherently unsafe to pack tar files from streams of unknown size due to OOM or DoS resource-exhaustion attacks. Tar files must know the size in advance to write the relevant headers, and this library should not implicitly buffer such sources.

You must either:

  • Acquire the length in advance (e.g. via fs.stat/fs.lstat or the Content-Length header). This is already handled for you if you instead pass a type: 'file' source with the file path.
  • Or buffer the stream yourself (e.g. with buffer from "node:stream/consumers"); a short sketch of both approaches follows below.
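
Here is a minimal sketch of both approaches; the fetch URL is a placeholder, and passing the buffered bytes as a content source assumes ContentSource accepts binary data alongside strings:

import { createReadStream } from 'node:fs';
import { stat } from 'node:fs/promises';
import { buffer } from 'node:stream/consumers';
import { packTar, type TarSource } from 'modern-tar/fs';

// 1. Acquire the length in advance so the stream source can write a correct header.
const { size } = await stat('./large-file.bin');

// 2. Or buffer an unknown-length stream yourself before packing it as content.
const remote = await fetch('https://example.com/api/data'); // placeholder URL
const remoteBody = await buffer(remote.body!); // Buffer, a Uint8Array subclass

const sources: TarSource[] = [
  { type: 'stream', content: createReadStream('./large-file.bin'), target: 'project/data.bin', size },
  { type: 'content', content: remoteBody, target: 'project/remote.json' },
];

const archive = packTar(sources);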

  • feat(pack): accept node readable by @ayuhito in #75
  • feat(pack): new stream source for fs packer by @ayuhito in #76
  • fix(unpack): handle base 256 bigint numeric headers by @ayuhito in #77

Full Changelog: v0.4.2...v0.5.0