Commit e74d1bde authored by Julian Andres Klode

Import Debian version 2.2.0

apt (2.2.0) unstable; urgency=medium

  * The "Happy soft freeze" release
  * Do not make DefaultRootSetFunc2 public symbol
  * kernels: Avoid std::regex for escaping '.' and '+'
  * symbols: Remove spurious package line, add kernel autoremoval helper

apt (2.1.20) unstable; urgency=medium

  * CI: Run test as user on i386
  * Fix test suite regression from StrToNum fixes. The tests started failing
    on 32-bit because the values were actually out of range, but we did not
    test errno before the last version, so it was not treated as an error.
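
A note on the errno point above, as a generic sketch rather than apt's actual StrToNum code: strtoull() reports an out-of-range value only by setting errno to ERANGE, so the caller has to clear errno before the call and check it afterwards - the check that made these already-out-of-range 32-bit test values start failing.

    // Generic sketch (assumed names, not apt's StrToNum): parse an unsigned
    // number and treat out-of-range input as an error, which requires
    // resetting and checking errno around strtoull().
    #include <cerrno>
    #include <cstdint>
    #include <cstdlib>

    static bool ParseUnsigned(const char *Str, uint64_t Max, uint64_t &Out)
    {
       errno = 0;                          // clear any stale error value
       char *End = nullptr;
       unsigned long long V = strtoull(Str, &End, 10);
       if (End == Str || errno == ERANGE)  // no digits, or value overflowed
          return false;
       if (V > Max)                        // e.g. Max = UINT32_MAX on 32-bit
          return false;
       Out = V;
       return true;
    }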

apt (2.1.19) unstable; urgency=medium

  [ Helge Kreutzmann ]
  * German program translation update (Closes: #979848)

  [ Youfu Zhang ]
  * dpkg: fix passing readonly /dev/null fd as stdout/stderr

  [ Diederik de Haas ]
  * Fix apt-acquire-additional-files entity's location.

  [ Wolfgang Schweer ]
  * vendor: Adjust Debian -security codename

  [ Julian Andres Klode ]
  * Include all translations when building the cache (LP: #1907850)

  [ David Kalnischkies ]
  * Various patches uplifted from unfinished fuzzer branches
    - Use 500 MB memory limit for xz/lzma decoding
    - Guess compressor only if no AR member with exact name exists
    - Free XXH3 state to avoid leak in cache hashing
    - Fail ConfigDir reading if directory listing failed
    - Retire and deprecate _strtabexpand
    - Fix incorrect base64 encoding due to int promotion
    - Don't parse \x and \0 past the end in DeEscapeString
    - Remove Word size limit from ParseQuote and CWord
    - Forbid negative values in unsigned StrToNum explicitly
    - Avoid overstepping bounds in config file parsing
    - Show 'Done' always for 'Building dependency tree'
    - Avoid undefined pointer arithmetic while growing mmap
    - Use error reporting instead of assert in rred patching
    - Replace PrintStatus with SendMessage usage
    - Ensure HTTP status code text has sensible content
    - Limit on first patch size only for server-merged patches
    - Use size of the old cache as APT::Cache-Start default
    - Remove spurious periods on progress strings in po/de.po

  [ Frans Spiesschaert ]
  * Dutch program translation update (Closes: #981885)
  * Dutch manpages translation update (Closes: #981883)

apt (2.1.18) unstable; urgency=high

  * pkgcachegen: Avoid write to old cache for Version::Extra (Closes: #980037)
  * Adjust apt-mark test for dpkg 1.20.7

apt (2.1.17) unstable; urgency=medium

  [ Américo Monteiro ]
  * Portuguese manpages translation update (Closes: #979725)

  [ Julian Andres Klode ]
  * kernels: Fix std::out_of_range if no kernels to protect
  * Call ischroot with -t

apt (2.1.16) unstable; urgency=medium

  [ Faidon Liambotis ]
  * Various fixes to http and connect method
    - basehttp: also consider Access when a Server's URI
    - connect: convert a C-style string to std::string
    - connect: use ServiceNameOrPort, not Port, as the cache key

  [ Julian Andres Klode ]
  * patterns: Add dependency patterns ?depends, ?conflicts, etc.
    Note that the -broken- variants are not implemented yet.
  * Rewrite of the kernel autoremoval code:
    - Determine autoremovable kernels at run-time (LP: #1615381), this fixes the
      issue where apt could consider a running kernel autoremovable
    - Automatically remove unused kernels on apt {full,dist}-upgrade.
      This helps ensure that we don't run out of /boot space.
    - Only keep up to 3 (not 4) kernels.
      Ubuntu boot partitions were sized for 3 kernels, not 4.
  * Bump codenames to bullseye/hirsute and adjust -security codename for
    bullseye (Closes: #969932)
  * Ignore failures from immediate configuration. This does not change the
    actual installation ordering - we never passed the return code to the
    caller, and installation went ahead anyway if it could be ordered at a
    later stage; this just removes spurious after-the-fact errors.
    (Closes: #973305, #188161, #211075, #649588) (LP: #1871268)
  * Add support for Phased-Update-Percentage, previously used only by
    update-manager.
  * Implement update --error-on=any so that scripts can reliably check for
    transient failures as well. (Closes: #594813)
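
On the Phased-Update-Percentage entry above: the general idea of phased updates is to map each machine/package/version combination to a stable value in 0-99 and only offer the update once the published percentage exceeds that value. The following is a hypothetical illustration of that idea only - the function name, inputs and hash choice are assumptions, not apt's implementation.

    // Hypothetical sketch: the same machine always computes the same number
    // for a given source package and version, so a rollout percentage selects
    // a deterministic subset of machines rather than a random one per run.
    #include <cstddef>
    #include <functional>
    #include <string>

    static bool IncludedInPhasing(const std::string &MachineId,
                                  const std::string &SourcePackage,
                                  const std::string &Version,
                                  unsigned int PhasedUpdatePercentage)
    {
       std::size_t Seed = std::hash<std::string>{}(MachineId + "-" + SourcePackage + "-" + Version);
       // a real implementation would use a hash that is stable across builds
       return (Seed % 100) < PhasedUpdatePercentage; // 100 means "offer to everyone"
    }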

  [ Demi M. Obenour ]
  * test/integration/framework: Be compatible with Bash

  [ Vangelis Skarmoutsos ]
  * Greek program translation update

apt (2.1.15) unstable; urgency=medium

  [ Julian Andres Klode ]
  * Unroll pkgCache::sHash 8 times, break up the dependency chain
  * Do not require libxxhash-dev for including pkgcachegen.h (Closes: #978171)
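
On the unrolling item above: the usual way to break up the dependency in a string hash is to keep several independent accumulators so consecutive characters no longer form a single serial chain. The sketch below shows the idea with four lanes; it is an illustration only, not apt's actual sHash.

    // Illustration: breaking the loop-carried dependency of a classic
    // "h = h * 33 + c" hash by using independent lanes, folded at the end.
    #include <cstddef>
    #include <cstdint>

    static uint32_t UnrolledHash(const char *Str, std::size_t Len)
    {
       uint32_t Lane[4] = {5381, 5381, 5381, 5381};
       std::size_t I = 0;
       for (; I + 4 <= Len; I += 4)
          for (int K = 0; K < 4; ++K)       // lanes do not depend on each other
             Lane[K] = Lane[K] * 33 + static_cast<unsigned char>(Str[I + K]);
       uint32_t Hash = 0;
       for (int K = 0; K < 4; ++K)
          Hash = Hash * 33 + Lane[K];       // fold the lanes together
       for (; I < Len; ++I)                 // leftover tail characters
          Hash = Hash * 33 + static_cast<unsigned char>(Str[I]);
       return Hash;
    }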

  [ David Kalnischkies ]
  * Proper URI encoding for config requests to our test webserver
  * Keep URIs encoded in the acquire system
  * Implement encoded URI handling in all methods
  * Don't re-encode encoded URIs in pkgAcqFile

  [ Helge Kreutzmann ]
  * German program translation update (Closes: #977938)

apt (2.1.14) unstable; urgency=medium

  * test: fixup for hash table size increase (changed output order)
  * Use XXH3 for cache, hash table hashing

apt (2.1.13) unstable; urgency=medium

  [ Debian Janitor ]
  * Apply multi-arch hints.
    + apt-doc, libapt-pkg-doc: Add Multi-Arch: foreign.

  [ Jordi Mallach ]
  * Fix typo in Catalan translation.

  [ David Kalnischkies ]
  * Prepare rred binary for external usage
  * Support reading compressed patches in rred direct call modes
  * Support compressed output from rred similar to apt-helper cat-file

  [ Julian Andres Klode ]
  * gitignore: Add /build and /obj-* build dirs
  * gitignore: Add .*.swp files
  * HexDigest: Silence -Wstringop-overflow
  * patterns: Terminate short pattern by ~ and !
  * SECURITY UPDATE: Integer overflow in parsing (LP: #1899193)
    - apt-pkg/contrib/arfile.cc: add extra checks.
    - apt-pkg/contrib/tarfile.cc: limit tar item sizes to 128 GiB
    - apt-pkg/deb/debfile.cc: limit control file sizes to 64 MiB
    - test/*: add tests.
    - CVE-2020-27350
  * Additional hardening:
    - apt-pkg/contrib/tarfile.cc: Limit size of long names and links to 1 MiB
  * Raise APT::Cache-HashtableSize to 196613
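
The size limits above follow a common hardening pattern: validate a size field and reject anything that overflows or exceeds a sanity limit before it is used for allocation or arithmetic. A generic sketch of that pattern (assumed names, not the actual apt patch):

    // Generic sketch: parse a decimal size field, failing on non-digits, on
    // uint64_t overflow, and on values above a caller-provided limit.
    #include <cstdint>
    #include <limits>
    #include <string>

    static bool ParseSizeField(const std::string &Field, uint64_t Limit, uint64_t &Out)
    {
       if (Field.empty())
          return false;
       uint64_t Value = 0;
       for (char C : Field)
       {
          if (C < '0' || C > '9')
             return false;
          uint64_t Digit = static_cast<uint64_t>(C - '0');
          if (Value > (std::numeric_limits<uint64_t>::max() - Digit) / 10)
             return false;                  // Value * 10 + Digit would overflow
          Value = Value * 10 + Digit;
       }
       if (Value > Limit)                   // e.g. 128 GiB for tar members
          return false;
       Out = Value;
       return true;
    }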

apt (2.1.12) unstable; urgency=medium

  [ Julian Andres Klode ]
  * pkgnames: Correctly set the default for AllNames to false (LP: #1876495)
  * pkgnames: Do not exclude virtual packages with --all-names
  * Remove expired domain that became nsfw from debian/changelog
  * Do not immediately configure m-a: same packages in lockstep (LP: #1871268)

  [ Américo Monteiro ]
  * Portuguese manpages translation update (Closes: #968414)

  [ David Kalnischkies ]
  * Rename CMake find_package helpers to avoid developer warnings
  * Install translated apt-patterns(7) man pages
  * Remove ancient versions support from apts postinst
  * Update libapt-pkg6.0 symbols file
  * Refresh lintian-overrides of apt and libapt-pkg-doc

apt (2.1.11) unstable; urgency=medium

  [ JCGoran ]
  * Fix "extended_states" typo in apt-mark(8) (Closes: #969086)

  [ Julian Andres Klode ]
  * doc: Bump Ubuntu release from focal to groovy
  * Do not produce late error if immediate configuration fails, just warn
    (Closes: #953260, #972552) (LP: #1871268)

  [ Frans Spiesschaert ]
  * Dutch manpages translation update (Closes: #970037)

apt (2.1.10) unstable; urgency=medium

  * Default Acquire::AllowReleaseInfoChange::Suite to "true" (Closes: #931566)
  * acquire: Do not hide _error messages in Fail()
  * Further improvements to HTTP method (Closes: #968220, verified against
    that server and the Canonical infra where it blocked buildds)
    - Do not use non-blocking local I/O - they don't do anything anyway,
      and we can't really use non-blocking I/O here because we need to be able
      to flush it.
    - Restore successful exits from Die() and rewrite Die() in a more
      comprehensible way, after careful code path analysis
    - http: Fully flush the local file both before and after reading from the
      server, avoiding both a partial flush before sending requests to the
      server and leftover data in the buffer when receiving from it.
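
The flushing fixes above come down to draining the in-memory buffer completely at well-defined points instead of relying on a single, possibly partial, write. A minimal generic sketch of such a drain loop over a blocking file descriptor (not the actual method code):

    // Generic sketch: write an entire buffer to a blocking fd, retrying on
    // EINTR and continuing after short writes, so nothing stays behind.
    #include <cerrno>
    #include <cstddef>
    #include <unistd.h>

    static bool DrainBuffer(int Fd, const char *Buf, std::size_t Len)
    {
       std::size_t Done = 0;
       while (Done < Len)
       {
          ssize_t N = write(Fd, Buf + Done, Len - Done);
          if (N < 0)
          {
             if (errno == EINTR)
                continue;                   // interrupted, just retry
             return false;                  // real I/O error
          }
          Done += static_cast<std::size_t>(N);
       }
       return true;
    }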

apt (2.1.9) unstable; urgency=medium

  [ Julian Andres Klode ]
  * http: Fix infinite loop on read errors
  * basehttp: Correctly handle non-transient failure from RunData()
  * Do not retry on failure to fetch (Closes: #968163)

  [ Aleix Vidal i Gaya ]
  * Updated Catalan translations

apt (2.1.8) unstable; urgency=medium

  [ Julian Andres Klode ]
  * Fully deprecate apt-key, schedule removal for Q2/2022
  * apt-key: Allow depending on gpg instead of gnupg
  * Removal of racist terminology, except for two cases that still need consensus
  * Various fixes to http code:
    - http: Always Close() the connection in Die()
    - http: Die(): Merge flushing code from Flush()
    - http: Only return false for EOF if we actually did not read anything
    - http: Die(): Do not flush the buffer, error out instead
    - http: Finish copying data from server to file before sending stuff to server
    - http: On select timeout, error out directly, do not call Die()
    - http: Redesign reading of pending data
    - http: Always write to the file if there's something to write; this fixes
      a regression from removing the buffer flushing code
    Overall, there's hope this Closes: #959518. It reproduced a bit, but eventually
    snapshot.d.o ratelimiting kicked in and broke the test case.

  [ Nicolas Schier ]
  * Support marking all newly installed packages as automatically installed

apt (2.1.7) unstable; urgency=medium

  [ David Kalnischkies ]
  * Do not hardcode (wrong) group and mode in setup warning (Closes: #962310)
  * Do not send our filename-provides trick to EDSP solvers (Closes: #962741)
  * Tell EDSP solvers about all installed pkgs ignoring arch
  * Deduplicate EDSP Provides line of M-A:foreign packages
  * Delay removals due to Conflicts until Depends are resolved
  * Filter out impossible solutions for protected propagation
  * Add dependency points in the resolver also to providers
  * Reorder config check before checking systemd for non-interactive http
  * Reorder config check before result looping for SRV parsing debug
  * Fix test due to display change in ls (coreutils 8.32)
  * Detect pkg-config-dpkghook failure in tests to avoid fallback (Closes: #964475)

  [ Américo Monteiro ]
  * Portuguese manpages translation update (Closes: #962483)

  [ Julian Andres Klode ]
  * Replace some magic 64*1024 with APT_BUFFER_SIZE
  * Add basic support for the Protected field

  [ Sergio Oller Moreno ]
  * Minor Catalan grammar typo

  [ Frans Spiesschaert ]
  * Dutch program translation update (Closes: #963008)

apt (2.1.6) unstable; urgency=medium

  [ David Kalnischkies ]
  * Fix small memory leak in MethodConfig
  * Consider protected packages for removal if they are marked as such
  * Consider if a fix is successful before claiming it is
  * Allow 20 instead of 10 loops for pkgProblemResolver
  * Deal with duplicates in the solution space of a dep

apt (2.1.5) unstable; urgency=medium

  [ David Kalnischkies ]
  * Reset candidate version explicitly for internal state-keeping
    (Closes: #961266)
  * Known-bad candidate versions are not an upgrade option
  * Keep status number if candidate is discarded for kept back display
  * Allow pkgDepCache to be asked to check internal consistency
  * Don't update candidate provides map if the same as current
  * Ensure EDSP doesn't use a dangling architecture string
  * Allow FMV SSE4.2 detection to succeed on clang
  * Mark PatternTreeParser::Node destructor as virtual

  [ Frans Spiesschaert ]
  * Dutch manpages translation update (Closes: #961431)

apt (2.1.4) unstable; urgency=medium

  [ David Kalnischkies ]
  * Check satisfiability for versioned provides, not providing version

apt (2.1.3) unstable; urgency=medium

  [ David Kalnischkies ]
  * Prefer use of O_TMPFILE in GetTempFile if available
  * Allow prefix to be a complete filename for GetTempFile
  * Properly handle interrupted write() call in ExtractTar
  * Skip reading data from tar members if nobody will look at it
  * Keep going if a dep is bad for user requests to improve errors
  * Support negative dependencies in VCI::FromDependency
  * Deal with protected solution providers first
  * Propagate protected to already satisfied conflicts (Closes: #960705)
  * Propagate protected to already satisfied dependencies
  * Recognize propagated protected in pkgProblemResolver

  [ Julian Andres Klode ]
  * private-search: Only use V.TranslatedDescription() if good (LP: #1877987)

apt (2.1.2) unstable; urgency=critical

  [ Julian Andres Klode ]
  * SECURITY UPDATE: Out of bounds read in ar, tar implementations (LP: #1878177)
    - apt-pkg/contrib/arfile.cc: Fix out-of-bounds read in member name
    - apt-pkg/contrib/arfile.cc: Fix out-of-bounds read on unterminated
      member names in error path
    - apt-pkg/contrib/extracttar.cc: Fix out-of-bounds read on unterminated
      member names in error path
    - CVE-2020-3810

  [ Frans Spiesschaert ]
  * Dutch program translation update (Closes: #960186)

apt (2.1.1) unstable; urgency=medium

  [ David Kalnischkies ]
  * Allow aptitude to MarkInstall broken packages via FromUser
  * Drop nowrap from po4a --porefs as it is no longer supported
  * Use "po4a --porefs file" instead of undocumented compat noline

  [ Artur Grącki ]
  * Fix typo in Polish translation of --help messages

apt (2.1.0) unstable; urgency=medium

  [ Frans Spiesschaert ]
  * Dutch manpages translation update (Closes: #956313)

  [ David Kalnischkies ]
  * Refactor MarkInstall fixing various or-group handling issues
    - Discard impossible candidate versions also for non-installed
    - Explore or-groups for Recommends further than first
    - Refactor and reorder MarkInstall code
    - Discard candidate if its dependencies can't be satisfied
    - Split up MarkInstall into private helper methods
    - Fail earlier on impossible Conflicts in MarkInstall
    - Propagate Protected flag to single-option dependencies
    - Prefer upgrading installed orgroup members
    - Protect a package while resolving in MarkInstall

  [ Julian Andres Klode ]
  * Reinstate * wildcards (Closes: #953531) (LP: #1872200)
  * apt list: Fix behavior of regex vs fnmatch vs wildcards

apt (2.0.2) unstable; urgency=medium

  [ Boyuan Yang ]
  * Simplified Chinese program translation update (Closes: #955023)

  [ Frans Spiesschaert ]
  * Dutch program translation update (Closes: #955505)

  [ Marco Ippolito ]
  * Fix grammar in apt(8): "by append(+ing) a" (Closes: #955412)

  [ Chris Leick ]
  * German manpage translation update
  * Fix "string match{ing,es}" and whitespace typo in apt-patterns(7)

  [ Julian Andres Klode ]
  * test/integration/apt.pem: Regenerate with SHA2 hashes to make the
    test work with stricter gnutls in Ubuntu which rejects SHA1
  * ubuntu: http: Add non-interactive to user agent if run by systemd
    (LP: #1825000)

apt (2.0.1) unstable; urgency=medium

  [ David Kalnischkies ]
  * Don't crash pattern matching sections if pkg has no section
  * Parse last line in deb file correctly by adding a newline

  [ Julian Andres Klode ]
  * apt-helper: Add analyze-pattern helper
  * Add color highlighting to E:/W:/N: prefixes (Closes: #953527)

  [ Алексей Шилин ]
  * Russian program translation update (Closes: #953804)

apt (2.0.0) unstable; urgency=medium

  * Upload to unstable - Happy APT 2.0 day!
  * GetLock: No strerror if it's just another process holding the lock
  * Show absolute time while waiting for lock instead of %, rework message

apt (1.9.12) experimental; urgency=medium

  * pkgcache: Add operator bool() to map_pointer
  * (temporarily) unhide pkgDPkgPM again to have python-apt compile

apt (1.9.11) experimental; urgency=medium

  [ Tomáš Janoušek ]
  * bash completion: Add autopurge command

  [ Tris Emmy Wilson ]
  * apt-mark: don't lie about successful marks

  [ Julian Andres Klode ]
  * apt(8): Wait for lock (Closes: #754103)
  * policy: Implement pinning by source package (Closes: #166032)
  * Initialize libgcrypt on first use (Closes: #949074)
  * Fix various compiler warnings
  * Bump ABI to 6.0; update symbols file; cleanup ABI:
    - Merge various function overloads together
    - Make stuff that should be virtual virtual
    - Default to hidden visibility
  * Code removals:
    - Use a 32-bit djb VersionHash instead of CRC-16
    - Remove CRC-16 implementation
  * Hardening:
    - tagfile: Check if memchr() returned null before using
    - tagfile: Check out-of-bounds access to Tags vector
  * Cache improvements:
    - Type safe cache: Replace map_pointer_t with map_pointer<T>
    - Extensibility: Add d-pointers to groups, packages, versions, and files
    - Prepare for package hashtable removal: Swap locations of hashtables
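
The "type safe cache" item above is about wrapping the raw offsets stored in the mmapped cache so that, for instance, an offset naming a Version can no longer be handed to code expecting a Package. A simplified, hypothetical sketch of the idea:

    // Simplified sketch: a typed wrapper around an offset into the mmapped
    // cache. The template parameter only exists to make mixing up offset
    // kinds a compile-time error; the stored value is still a 32-bit offset.
    #include <cstdint>

    template <typename T>
    struct map_pointer
    {
       uint32_t Offset = 0;
       explicit operator bool() const { return Offset != 0; }
    };

    template <typename T>
    T *Resolve(char *CacheBase, map_pointer<T> P)   // offset -> real pointer
    {
       return P ? reinterpret_cast<T *>(CacheBase + P.Offset) : nullptr;
    }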

  [ Nis Martensen ]
  * apt-pkg/srcrecords.cc: 'source' means 'deb-src' in error message

  [ David Kalnischkies ]
  * Parse records including empty tag names correctly

apt (1.9.10) experimental; urgency=medium

  [ David Kalnischkies ]
  * Fix remaining usec vs sec time-delta calculation typos.
    Thanks to Trent W. Buck for initial patch (Closes: #950776)

  [ Julian Andres Klode ]
  * seccomp: Allow time64 variants (>402,<415) of allowed syscalls
    (Closes: #951012)
  * debian/control: Bump libseccomp-dev Build-Depends to >= 2.4.2
  * seccomp: Allow recvmmsg_time64() and futex_time64()
  * policy: Add SetPriority() methods
  * Revert "Add a Packages-Require-Authorization Release file field"

  [ Michael Vogt ]
  * doc: remove "WIP" from apt.8.xml

apt (1.9.9) experimental; urgency=medium

  * Widen regular expressions for versioned kernel packages (LP: #1607845)
  * Implement short patterns (patterns starting with ~)

apt (1.9.8) experimental; urgency=medium

  * pkgcache.cc: Mix PACKAGE_VERSION into the cache hash
  * mmap: Do not look for empty pool unless we need to
  * apt-verbatim.ent: Update ubuntu-codename from disco to focal
  * NewGroup: Create GrpIterator after allocation (fix segfault)

apt (1.9.7) experimental; urgency=medium

  * Trim trailing whitespace (thanks lintian-brush)
  * NewProvidesAllArch: Check if group is empty before using it.
    This caused automake-1.16 to not be provided by automake anymore,
    because apt wanted to add provides to packages in an empty automake-1.16
    group. LP: #1859952
  * Fix debian-rules-uses-deprecated-systemd-override.
    We accidentally managed to restart apt-daily{,-upgrade}.service
    again because our dh_systemd_start override was being ignored
    since we switched to debhelper 12. Override dh_installsystemd
    instead.

apt (1.9.6) experimental; urgency=medium

  [ Julian Andres Klode ]
  * gitlab-ci: Do not do coverage
  * gitlab-ci: Use ccache
  * satisfy: Fix segmentation fault when called with empty argument
  * Add support for GTest 1.9, do not fail silently if it's missing
  * gtests: Fix netrc parser test regression from https-only changes
  * Macro cleanup:
    - Avoid #define _error, use anonymous C++ struct instead (Closes: #948338)
    - Rename _count() macro to APT_ARRAY_SIZE()
    - Remove various unused macros like MAX/MIN/ABS/APT_CONST
    - Only define likely/unlikely if APT_COMPILING_APT set
  * Performance: Avoid extra out-of-cache hash table deduplication for
    package names, this saved about 10-16% on gencaches in memory
  * acquire: Move queue startup after calling log's Start(), fixes abort()
    calls in python-apt
  * hashes: Use Libgcrypt for hashing purposes
    - Raise buffer size for Hashes::AddFD() from 4 KiB to 64 KiB
    - Convert users of {MD5,SHA1,SHA256,SHA512}Summation to use Hashes
    - Deprecate the Summation classes and mark them for removal
    - Remove includes of (md5|sha1|sha2).h headers
  * netrc: Add warning when ignoring entries for unencrypted protocols
  * apt(8): Disable regular expressions and fnmatch
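
For the libgcrypt conversion above, the message-digest API being adopted looks roughly like this standalone example of generic libgcrypt usage (not apt's Hashes wrapper); it would be linked with -lgcrypt.

    // Generic libgcrypt usage: compute and print the SHA-256 digest of a buffer.
    #include <gcrypt.h>
    #include <cstdio>

    int main()
    {
       if (gcry_check_version(nullptr) == nullptr)   // library must be initialised
          return 1;
       gcry_control(GCRYCTL_INITIALIZATION_FINISHED, 0);

       gcry_md_hd_t Handle;
       if (gcry_md_open(&Handle, GCRY_MD_SHA256, 0) != 0)
          return 1;

       const char Data[] = "The quick brown fox";
       gcry_md_write(Handle, Data, sizeof(Data) - 1);

       const unsigned char *Digest = gcry_md_read(Handle, GCRY_MD_SHA256);
       for (unsigned int I = 0; I < gcry_md_get_algo_dlen(GCRY_MD_SHA256); ++I)
          std::printf("%02x", Digest[I]);
       std::printf("\n");

       gcry_md_close(Handle);
       return 0;
    }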

  [ David Kalnischkies ]
  * Drop g++ build-dependency to help crossbuilding (Closes: #948201)

  [ Denis Mosolov ]
  * Fix typo in README.md

apt (1.9.5) experimental; urgency=medium

  [ Julian Andres Klode ]
  * Parse 'show' arguments for the 'info' alias as well (LP: #1843812)
  * patterns: Add base class for regular expression matching
  * patterns: Add ?version
  * patterns: Add ?source-name and ?source-version
  * patterns: Add ?archive
  * patterns: Add ?origin
  * patterns: Add ?any-version
  * patterns: Implement ?narrow(...), as ?any-version(?and(...))
  * patterns: Add ?all-versions
  * patterns: Add ?section
  * netrc: Restrict auth.conf entries to https by default (Closes: #945911)

  [ Anatoly Borodin ]
  * README.md: fix dead anonscm link

  [ Алексей Шилин ]
  * Search in all available description translations (Closes: #490000)
  * strutl: Add APT::String::DisplayLength() function
  * Fix progress bar width for multibyte charsets

  [ Chris Leick ]
  * German manpage translation update

  [ David Kalnischkies ]
  * Use correct filename on IMS-hit reverify for indices
  * Remove failed trusted signature instead of index on IMS hit

  [ Anthony Papillon ]
  * Fix a mistake in man french translation

apt (1.9.4) experimental; urgency=medium

  * CMake: Pass -Werror=return-type to gcc
  * CMake: Produce a fatal error if triehash could not be found
  * apt.systemd.daily: Do not numerically check if intervals equal 0
    (LP: #1840995)
  * srvrec: Use re-entrant resolver functions
  * Pass --abort-after=1 to dpkg when using --force-depends (Closes: #935910)
    (LP: #1844634)
  * Fix use of GTest to adjust for GTest 1.9

apt (1.9.3) experimental; urgency=medium

  * Fix segfault in pkgAcquire::Enqueue() with Acquire::Queue-Mode=access
    (LP: #1839714)
  * test: Use valgrind to ensure Acquire::Queue-Mode=access does not crash
  * Add initial support for package patterns (patterns on versions WIP)

apt (1.9.2) experimental; urgency=medium

  [ Julian Andres Klode ]
  * Improve locking messaging - pid and name, "do not remove lock file"

  [ Lynn Cyrin ]
  * Change a pronoun in the readme from `he` to `they`

  [ David Kalnischkies ]
  * Distribute host-less work based on backlog of the queues
  * Show details about the package with bad Provides
  * Apply various suggestions by cppcheck

apt (1.9.1) experimental; urgency=medium

  * RFC1123StrToTime: Accept const std::string& as first argument
  * Fix pkg-config-test autopkgtest

apt (1.9.0) experimental; urgency=medium

  [ Julian Andres Klode ]
  * CMakeLists.txt: Bump C++ standard version to C++14
  * debian: Update to debhelper-compat (= 12)
  * debian/rules: Do not use dh_install --list-missing (dh 12 porting)
  * Remove all the deprecated bits, merge various function prototypes together
  * prepare-release: Add merge-translations command
  * Use system-provided triehash
  * CI: Use unstable for now, as we need triehash package
  * Tighten dependencies from apt and apt-utils on libs
  * Add test case for local-only packages pinned to never
  * acq: worker: Move CurrentSize, TotalSize, ResumePoint to CurrentItem
  * apt-helper: Support multiple hashes for a file
  * Add 'explicit' to most single argument constructors
  * Get rid of pkgExtract and pkgFLCache
  * Merge libapt-inst into libapt-pkg
  * Use debDebFile to get control file instead of dpkg-deb
  * prepare-release: Add bump-abi command
  * Change soname to libapt-pkg.so.5.90
  * CMake: Enforce "override" use on overridden methods
  * debmetaindex: Use isspace_ascii() variant to normalize Signed-By
  * README.md: Quote -j <count> as code with backticks
  * apt-mark: Add hidden showheld alias for showhold
  * Minor wording improvements in documentation
  * Make APT::StringView public, replace std::string with it in various places
  * Introduce apt satisfy and apt-get satisfy (Closes: #275379)
  * Run unifdef -DAPT_{8,9,10,15}_CLEANER_HEADERS
  * Adjust code for missing includes, and using std::string
  * Bump cache MajorVersion to 16

  [ Corentin Noël ]
  * Add pkg-config files for the apt-pkg and apt-inst libraries
    (Closes: #439121)

  [ Simon McVittie ]
  * vendor/getinfo: Iterate through vendors in lexicographic order
    (Closes: #924662)
  * vendor/getinfo: Don't assume that Ubuntu is the last vendor
    (Closes: #924662)

  [ Martin Michlmayr ]
  * Perform minor copy-editing on the docs

  [ Ivan Krylov ]
  * Mark apt-transport-https as M-A:foreign (Closes: #905141)

  [ David Kalnischkies ]
  * Don't limit cpu-limited queues to at most 10

  [ Stephen Kitt ]
  * apt-cache: only show solutions if displayed

  [ Brian Murray ]
  * Do not include squashfs file systems in df output. (LP: #1756595)

  [ Simon Körner ]
  * http: Fix Host header in proxied https connections
parent 888910da
Merge request !32: Manual initial sync of changes from Debian Bullseye
Pipeline #248054 passed
Showing 164 additions and 736 deletions
image: debian:buster
image: debian:unstable
variables:
DEBIAN_FRONTEND: noninteractive
CCACHE_DIR: $CI_PROJECT_DIR/.ccache
CCACHE_BASEDIR: $CI_PROJECT_DIR
cache:
paths:
- .ccache
test as root:
stage: test
......@@ -8,29 +13,28 @@ test as root:
- adduser --home /home/travis travis --quiet --disabled-login --gecos "" --uid 1000
- rm -f /etc/dpkg/dpkg.cfg.d/excludes
- apt-get update
- apt-get install -qq build-essential expect gcovr sudo
- apt-get install -qq build-essential expect sudo ccache
- chmod -R o+rwX $PWD
- ./prepare-release travis-ci
- sudo -u travis mkdir build
- sudo -u travis env -C build cmake -DCMAKE_BUILD_TYPE=Coverage -G Ninja ..
- sudo -u travis ninja -C build
- sudo -u travis mkdir -p build .ccache
- sudo -u travis env -C build cmake -DCMAKE_BUILD_TYPE=Release -DCMAKE_C_COMPILER_LAUNCHER=ccache -DCMAKE_CXX_COMPILER_LAUNCHER=ccache -G Ninja ..
- sudo -u travis --preserve-env=CCACHE_DIR,CCACHE_BASEDIR ninja -C build
- CTEST_OUTPUT_ON_FAILURE=1 ninja -C build test
- unbuffer ./test/integration/run-tests -q -j 4
- gcovr
test as user:
image: i386/debian:unstable
stage: test
script:
- adduser --home /home/travis travis --quiet --disabled-login --gecos "" --uid 1000
- rm -f /etc/dpkg/dpkg.cfg.d/excludes
- apt-get update
- apt-get install -qq build-essential expect gcovr sudo
- apt-get install -qq build-essential expect sudo ccache
- chmod 755 /root
- chmod -R o+rwX $PWD
- ./prepare-release travis-ci
- sudo -u travis mkdir build
- sudo -u travis env -C build cmake -DCMAKE_BUILD_TYPE=Coverage -G Ninja ..
- sudo -u travis ninja -C build
- sudo -u travis mkdir -p build .ccache
- sudo -u travis env -C build cmake -DCMAKE_BUILD_TYPE=Release -DCMAKE_C_COMPILER_LAUNCHER=ccache -DCMAKE_CXX_COMPILER_LAUNCHER=ccache -G Ninja ..
- sudo -u travis --preserve-env=CCACHE_DIR,CCACHE_BASEDIR ninja -C build
- sudo -u travis CTEST_OUTPUT_ON_FAILURE=1 ninja -C build test
- sudo -u travis unbuffer ./test/integration/run-tests -q -j 4
- sudo -u travis gcovr
# CMake support for target-based function multiversioning
#
# Copyright (C) 2019 Canonical Ltd
#
# Author: Julian Andres Klode <jak@debian.org>.
#
# Permission is hereby granted, free of charge, to any person
# obtaining a copy of this software and associated documentation files
# (the "Software"), to deal in the Software without restriction,
# including without limitation the rights to use, copy, modify, merge,
# publish, distribute, sublicense, and/or sell copies of the Software,
# and to permit persons to whom the Software is furnished to do so,
# subject to the following conditions:
#
# The above copyright notice and this permission notice shall be
# included in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
# NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS
# BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN
# ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
# CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.
function(check_cxx_target var target code)
check_cxx_source_compiles(
"
__attribute__((target(\"${target}\"))) static int foo() { ${code} return 1; }
__attribute__((target(\"default\"))) static int foo() { ${code} return 0; }
int main() { return foo(); }
" ${var})
endfunction()
......@@ -318,7 +318,7 @@ function(add_update_po4a target pot header)
add_custom_target(${target}
COMMAND po4a --previous --no-backups --force --no-translations
--msgmerge-opt --add-location=file
--porefs noline,wrap
--porefs file
--package-name=${PROJECT_NAME}-doc --package-version=${PACKAGE_VERSION}
--msgid-bugs-address=${PACKAGE_MAIL} po4a.conf
${WRITE_HEADER}
# - Try to find Berkeley DB
# Once done this will define
#
# BERKELEY_DB_FOUND - system has Berkeley DB
# BERKELEY_DB_INCLUDE_DIRS - the Berkeley DB include directory
# BERKELEY_DB_LIBRARIES - Link these to use Berkeley DB
# BERKELEY_DB_DEFINITIONS - Compiler switches required for using Berkeley DB
# BERKELEY_FOUND - system has Berkeley DB
# BERKELEY_INCLUDE_DIRS - the Berkeley DB include directory
# BERKELEY_LIBRARIES - Link these to use Berkeley DB
# BERKELEY_DEFINITIONS - Compiler switches required for using Berkeley DB
# Copyright (c) 2006, Alexander Dymo, <adymo@kdevelop.org>
# Copyright (c) 2016, Julian Andres Klode <jak@debian.org>
......@@ -35,7 +35,7 @@
# We need NO_DEFAULT_PATH here, otherwise CMake helpfully picks up the wrong
# db.h on BSD systems instead of the Berkeley DB one.
find_path(BERKELEY_DB_INCLUDE_DIRS db.h
find_path(BERKELEY_INCLUDE_DIRS db.h
${CMAKE_INSTALL_FULL_INCLUDEDIR}/db5
/usr/local/include/db5
/usr/include/db5
......@@ -51,9 +51,9 @@ find_path(BERKELEY_DB_INCLUDE_DIRS db.h
NO_DEFAULT_PATH
)
find_library(BERKELEY_DB_LIBRARIES NAMES db db-5)
find_library(BERKELEY_LIBRARIES NAMES db db-5)
include(FindPackageHandleStandardArgs)
find_package_handle_standard_args(Berkeley "Could not find Berkeley DB >= 4.1" BERKELEY_DB_INCLUDE_DIRS BERKELEY_DB_LIBRARIES)
# show the BERKELEY_DB_INCLUDE_DIRS and BERKELEY_DB_LIBRARIES variables only in the advanced view
mark_as_advanced(BERKELEY_DB_INCLUDE_DIRS BERKELEY_DB_LIBRARIES)
find_package_handle_standard_args(Berkeley "Could not find Berkeley DB >= 4.1" BERKELEY_INCLUDE_DIRS BERKELEY_LIBRARIES)
# show the BERKELEY_INCLUDE_DIRS and BERKELEY_LIBRARIES variables only in the advanced view
mark_as_advanced(BERKELEY_INCLUDE_DIRS BERKELEY_LIBRARIES)
# - Try to find GCRYPT
# Once done, this will define
#
# GCRYPT_FOUND - system has GCRYPT
# GCRYPT_INCLUDE_DIRS - the GCRYPT include directories
# GCRYPT_LIBRARIES - the GCRYPT library
find_package(PkgConfig)
pkg_check_modules(GCRYPT_PKGCONF libgcrypt)
find_path(GCRYPT_INCLUDE_DIRS
NAMES gcrypt.h
PATHS ${GCRYPT_PKGCONF_INCLUDE_DIRS}
)
find_library(GCRYPT_LIBRARIES
NAMES gcrypt
PATHS ${GCRYPT_PKGCONF_LIBRARY_DIRS}
)
include(FindPackageHandleStandardArgs)
find_package_handle_standard_args(GCRYPT DEFAULT_MSG GCRYPT_INCLUDE_DIRS GCRYPT_LIBRARIES)
mark_as_advanced(GCRYPT_INCLUDE_DIRS GCRYPT_LIBRARIES)
File moved
# - Try to find XXHASH
# Once done, this will define
#
# XXHASH_FOUND - system has XXHASH
# XXHASH_INCLUDE_DIRS - the XXHASH include directories
# XXHASH_LIBRARIES - the XXHASH library
find_package(PkgConfig)
pkg_check_modules(XXHASH_PKGCONF libxxhash)
find_path(XXHASH_INCLUDE_DIRS
NAMES xxhash.h
PATHS ${XXHASH_PKGCONF_INCLUDE_DIRS}
)
find_library(XXHASH_LIBRARIES
NAMES xxhash
PATHS ${XXHASH_PKGCONF_LIBRARY_DIRS}
)
include(FindPackageHandleStandardArgs)
find_package_handle_standard_args(XXHASH DEFAULT_MSG XXHASH_INCLUDE_DIRS XXHASH_LIBRARIES)
mark_as_advanced(XXHASH_INCLUDE_DIRS XXHASH_LIBRARIES)
File moved
......@@ -51,17 +51,17 @@ function(add_vendor_file)
endfunction()
# Add symbolic links to a file
function(add_slaves destination master)
set(slaves "")
foreach(slave ${ARGN})
add_custom_command(OUTPUT ${CMAKE_CURRENT_BINARY_DIR}/${slave}
COMMAND ${CMAKE_COMMAND} -E create_symlink ${master} ${CMAKE_CURRENT_BINARY_DIR}/${slave})
list(APPEND slaves ${CMAKE_CURRENT_BINARY_DIR}/${slave})
function(add_links directory target)
set(link_names "")
foreach(link_name ${ARGN})
add_custom_command(OUTPUT ${CMAKE_CURRENT_BINARY_DIR}/${link_name}
COMMAND ${CMAKE_COMMAND} -E create_symlink ${target} ${CMAKE_CURRENT_BINARY_DIR}/${link_name})
list(APPEND link_names ${CMAKE_CURRENT_BINARY_DIR}/${link_name})
endforeach()
STRING(REPLACE "/" "-" master "${master}")
add_custom_target(${master}-slaves ALL DEPENDS ${slaves})
install(FILES ${slaves} DESTINATION ${destination})
STRING(REPLACE "/" "-" target "${target}")
add_custom_target(${target}-link_names ALL DEPENDS ${link_names})
install(FILES ${link_names} DESTINATION ${directory})
endfunction()
# Generates a simple version script versioning everything with current SOVERSION
......@@ -136,8 +136,8 @@ function(apt_add_update_po)
list(APPEND potfiles ${CMAKE_CURRENT_BINARY_DIR}/${domain}.pot)
endforeach()
get_filename_component(master_name ${output} NAME_WE)
add_custom_target(nls-${master_name}
get_filename_component(primary_name ${output} NAME_WE)
add_custom_target(nls-${primary_name}
COMMAND msgcomm --sort-by-file --add-location=file
--more-than=0 --output=${output}
${potfiles}
......@@ -154,11 +154,11 @@ function(apt_add_update_po)
endif()
add_custom_target(update-po-${langcode}
COMMAND msgmerge -q --previous --update --backup=none ${translation} ${output}
DEPENDS nls-${master_name}
DEPENDS nls-${primary_name}
)
add_dependencies(update-po update-po-${langcode})
endforeach()
add_dependencies(update-po nls-${master_name})
add_dependencies(update-po nls-${primary_name})
endfunction()
function(apt_add_po_statistics excluded)
......@@ -2,6 +2,9 @@
/* Internationalization macros for apt. This header should be included last
in each C file. */
#ifndef APT_I18N_H
#define APT_I18N_H
// Set by autoconf
#cmakedefine USE_NLS
......@@ -19,11 +22,13 @@
# define N_(x) x
#else
// apt will not use any gettext
# define setlocale(a, b)
# define textdomain(a)
# define bindtextdomain(a, b)
extern "C" inline char* setlocale(int, const char*) throw() { return nullptr; }
extern "C" inline char* textdomain(const char*) throw() { return nullptr; }
extern "C" inline char* bindtextdomain(const char*, const char*) throw() { return nullptr; }
extern "C" inline char* dgettext(const char*, const char* msg) throw() { return const_cast<char*>(msg); }
# define _(x) x
# define P_(msg,plural,n) (n == 1 ? msg : plural)
# define N_(x) x
# define dgettext(d, m) m
#endif
#endif
......@@ -64,6 +64,9 @@
/* The mail address to reach upstream */
#define PACKAGE_MAIL "${PACKAGE_MAIL}"
/* Guard for code that should only be emitted when compiling apt */
#define APT_COMPILING_APT
/* Various directories */
#cmakedefine CMAKE_INSTALL_FULL_BINDIR "${CMAKE_INSTALL_FULL_BINDIR}"
#cmakedefine STATE_DIR "${STATE_DIR}"
......@@ -77,14 +80,5 @@
/* Group of the root user */
#cmakedefine ROOT_GROUP "${ROOT_GROUP}"
/* defined if __builtin_ia32_crc32{s,d}i() exists in an sse4.2 target */
#cmakedefine HAVE_FMV_SSE42_AND_CRC32
#cmakedefine HAVE_FMV_SSE42_AND_CRC32DI
#define APT_8_CLEANER_HEADERS
#define APT_9_CLEANER_HEADERS
#define APT_10_CLEANER_HEADERS
#define APT_15_CLEANER_HEADERS
/* unrolling is faster combined with an optimizing compiler */
#define SHA2_UNROLL_TRANSFORM
......@@ -11,6 +11,7 @@ include_directories(${PROJECT_BINARY_DIR}/include)
enable_testing()
option(WITH_DOC "Build documentation." ON)
option(WITH_TESTS "Build tests" ON)
option(USE_NLS "Localisation support." ON)
set(CMAKE_MODULE_PATH "${PROJECT_SOURCE_DIR}/CMake")
......@@ -38,6 +39,12 @@ find_package(Iconv REQUIRED)
find_package(Perl REQUIRED)
find_program(TRIEHASH_EXECUTABLE NAMES triehash)
if (NOT TRIEHASH_EXECUTABLE)
message(FATAL_ERROR "Could not find triehash executable")
endif()
if(USE_NLS)
find_package(Intl REQUIRED)
link_libraries(${Intl_LIBRARIES})
......@@ -50,7 +57,7 @@ add_definitions(${LFS_DEFINITIONS})
link_libraries(${LFS_LIBRARIES})
# Set compiler flags
set(CMAKE_CXX_STANDARD 11)
set(CMAKE_CXX_STANDARD 14)
set(CMAKE_CXX_STANDARD_REQUIRED ON)
set(CMAKE_VISIBILITY_INLINES_HIDDEN 1)
......@@ -69,10 +76,12 @@ add_optional_compile_options(Wnoexcept)
add_optional_compile_options(Wsign-promo)
add_optional_compile_options(Wundef)
add_optional_compile_options(Wdouble-promotion)
add_optional_compile_options(Wsuggest-override)
add_optional_compile_options(Werror=suggest-override)
add_optional_compile_options(Werror=return-type)
# apt-ftparchive dependencies
find_package(BerkeleyDB REQUIRED)
if (BERKELEY_DB_FOUND)
find_package(Berkeley REQUIRED)
if (BERKELEY_FOUND)
set(HAVE_BDB 1)
endif()
......@@ -104,7 +113,7 @@ if (LZ4_FOUND)
set(HAVE_LZ4 1)
endif()
find_package(Zstd)
find_package(ZSTD)
if (ZSTD_FOUND)
set(HAVE_ZSTD 1)
endif()
......@@ -120,11 +129,14 @@ if (SYSTEMD_FOUND)
set(HAVE_SYSTEMD 1)
endif()
find_package(Seccomp)
find_package(SECCOMP)
if (SECCOMP_FOUND)
set(HAVE_SECCOMP 1)
endif()
find_package(GCRYPT REQUIRED)
find_package(XXHASH REQUIRED)
# Mount()ing and stat()ing and friends
check_symbol_exists(statfs sys/vfs.h HAVE_VFS_H)
check_include_files(sys/params.h HAVE_PARAMS_H)
......@@ -178,22 +190,18 @@ if (NOT HAVE_SIGHANDLER_T)
endif()
# Handle resolving
check_function_exists(res_init HAVE_LIBC_RESOLV)
check_function_exists(res_ninit HAVE_LIBC_RESOLV)
if(HAVE_LIBC_RESOLV)
set(RESOLV_LIBRARIES)
else()
set(RESOLV_LIBRARIES -lresolv)
endif()
# Check multiversioning
include(CheckCxxTarget)
check_cxx_target(HAVE_FMV_SSE42_AND_CRC32 "sse4.2" "__builtin_ia32_crc32si(0, 1llu);")
check_cxx_target(HAVE_FMV_SSE42_AND_CRC32DI "sse4.2" "__builtin_ia32_crc32di(0, 1llu);")
# Configure some variables like package, version and architecture.
set(PACKAGE ${PROJECT_NAME})
set(PACKAGE_MAIL "APT Development Team <deity@lists.debian.org>")
set(PACKAGE_VERSION "1.8.2.2")
set(PACKAGE_VERSION "2.2.0")
string(REGEX MATCH "^[0-9.]+" PROJECT_VERSION ${PACKAGE_VERSION})
if (NOT DEFINED DPKG_DATADIR)
execute_process(COMMAND ${PERL_EXECUTABLE} -MDpkg -e "print $Dpkg::DATADIR;"
......@@ -229,7 +237,6 @@ configure_file(CMake/apti18n.h.in ${PROJECT_BINARY_DIR}/include/apti18n.h)
add_subdirectory(vendor)
add_subdirectory(apt-pkg)
add_subdirectory(apt-private)
add_subdirectory(apt-inst)
add_subdirectory(cmdline)
add_subdirectory(completions)
add_subdirectory(doc)
FROM debian:buster
FROM debian:unstable
COPY . /tmp
WORKDIR /tmp
RUN sed -i s#://deb.debian.org#://cdn-fastly.deb.debian.org# /etc/apt/sources.list \
APT
===
apt is the main commandline package manager for Debian and its derivatives.
It provides commandline tools for searching and managing as well as querying
apt is the main command-line package manager for Debian and its derivatives.
It provides command-line tools for searching and managing as well as querying
information about packages as well as low-level access to all features
provided by the libapt-pkg and libapt-inst libraries which higher-level
package managers can depend upon.
......@@ -13,24 +13,24 @@ Included tools are:
from authenticated sources and for installation, upgrade and
removal of packages together with their dependencies
* **apt-cache** for querying available information about installed
as well as installable packages
as well as available packages
* **apt-cdrom** to use removable media as a source for packages
* **apt-config** as an interface to the configuration settings
* **apt-key** as an interface to manage authentication keys
* **apt-extracttemplates** to be used by debconf to prompt for configuration
questions before installation
* **apt-ftparchive** creates Packages and other index files
needed to publish an archive of debian packages
needed to publish an archive of deb packages
* **apt-sortpkgs** is a Packages/Sources file normalizer
* **apt** is a high-level commandline interface for better interactive usage
* **apt** is a high-level command-line interface for better interactive usage
The libraries libapt-pkg and libapt-inst are also maintained as part of this project,
alongside various additional binaries like the acquire-methods used by them.
alongside various additional binaries like the acquire methods used by them.
Bindings for Python ([python-apt](https://tracker.debian.org/pkg/python-apt)) and
Perl ([libapt-pkg-perl](https://tracker.debian.org/pkg/libapt-pkg-perl)) are available as separated projects.
Discussion happens mostly on [the mailinglist](mailto:deity@lists.debian.org) ([archive](https://lists.debian.org/deity/)) and on [IRC](irc://irc.oftc.net/debian-apt).
Our bugtracker as well as a general overview can be found at the [Debian Tracker page](https://tracker.debian.org/pkg/apt).
Discussion happens mostly on [the mailing list](mailto:deity@lists.debian.org) ([archive](https://lists.debian.org/deity/)) and on [IRC](irc://irc.oftc.net/debian-apt).
Our bug tracker as well as a general overview can be found at the [Debian Tracker page](https://tracker.debian.org/pkg/apt).
Contributing
......@@ -46,7 +46,7 @@ are encouraged to do as well.
### Coding
APT uses cmake. To start building, you need to run
APT uses CMake. To start building, you need to run
cmake <path to source directory>
......@@ -55,15 +55,15 @@ run:
cmake .
Then you can use make as you normally would (pass -j <count> to perform <count>
Then you can use make as you normally would (pass `-j <count>` to perform `<count>`
jobs in parallel).
You can also use the Ninja generator of cmake, to do that pass
You can also use the Ninja generator of CMake, to do that pass
-G Ninja
to the cmake invocation, and then use ninja instead of make.
The source code uses in most parts a relatively uncommon indent convention,
namely 3 spaces with 8 space tab (see [doc/style.txt](https://anonscm.debian.org/git/apt/apt.git/tree/doc/style.txt) for more on this).
namely 3 spaces with 8 space tab (see [doc/style.txt](./doc/style.txt) for more on this).
Adhering to it avoids unnecessary code-churn destroying history (aka: `git blame`)
and you are therefore encouraged to write patches in this style.
Your editor can surely help you with this, for vim the settings would be
......@@ -73,23 +73,23 @@ Your editor can surely help you with this, for vim the settings would be
### Translations
While we welcome contributions here, we highly encourage you to contact the [Debian Internationalization (i18n) team](https://wiki.debian.org/Teams/I18n).
Various language teams have formed which can help you creating, maintaining
and improving a translation, while we could only do a basic syntax check of the
Various language teams have formed which can help you create, maintain
and improve a translation, while we could only do a basic syntax check of the
file format…
Further more, Translating APT is split into two independent parts:
Further more, translating APT is split into two independent parts:
The program translation, meaning the messages printed by the tools,
as well as the manpages and other documentation shipped with APT.
as well as the manual pages and other documentation shipped with APT.
### Bug triage
Software tools like APT which are used by thousands of users every
day have a steady flow of incoming bugreports. Not all of them are really
bugs in APT: It can be packaging bugs like failing maintainer scripts a
user reports against apt, because apt was the command he executed leading
to this failure or various wishlist items for new features. Given enough time
also the occasional duplicate enters the system.
Our bugtracker is therefore full with open bugreports which are waiting for you! ;)
Software tools like APT, which are used by thousands of users every
day, have a steady flow of incoming bug reports. Not all of them are really
bugs in APT: It can be packaging bugs, like failing maintainer scripts, that a
user reports against apt, because apt was the command they executed that lead
to this failure; or various wishlist items for new features. Given enough time
the occasional duplicate enters the system as well.
Our bug tracker is therefore full with open bug reports which are waiting for you! ;)
Testing
-------
......@@ -101,17 +101,17 @@ automatically inserts an rpath so the binaries find the correct libraries.
Note that you have to invoke CMake with the right install prefix set (e.g.
`-DCMAKE_INSTALL_PREFIX=/usr`) to have your build find and use the right files
by default or alternatively set the locations at runtime via an `APT_CONFIG`
by default or alternatively set the locations at run-time via an `APT_CONFIG`
configuration file.
### Integration tests
There is an extensive integration testsuite available which can be run via:
There is an extensive integration test suite available which can be run via:
$ ./test/integration/run-tests
Each test can also be run individually as well. The tests are very noisy by
default, especially so while running all of them it might be beneficial to
default, especially so while running all of them; it might be beneficial to
enabling quiet (`-q`) or very quiet (`-qq`) mode. The tests can also be run in
parallel via `-j X` where `X` is the number of jobs to run.
......@@ -121,7 +121,7 @@ run them on [Travis CI](https://travis-ci.org/) and
[Shippable](https://shippable.com/) as well as via autopkgtests e.g. on
[Debian Continuous Integration](https://ci.debian.net/packages/a/apt/).
A testcase here is a shellscript embedded in a framework creating an environment in which
A test case here is a shell script embedded in a framework creating an environment in which
apt tools can be used naturally without root-rights to test every aspect of its behavior
itself as well as in conjunction with dpkg and other tools while working with packages.
......@@ -137,24 +137,24 @@ Debugging
---------
APT does many things, so there is no central debug mode which could be
activated. It uses instead various config-options to activate debug output
activated. Instead, it uses various configuration options to activate debug output
in certain areas. The following describes some common scenarios and generally
useful options, but is in no way exhaustive.
Note that you should *NEVER* use these settings as root to avoid accidents.
Note that, to avoid accidents, you should *NEVER* use these settings as root.
Simulation mode (`-s`) is usually sufficient to help you run apt as a non-root user.
### Using different state files
If a dependency solver bug is reported, but can't be reproduced by the
triager easily, it is beneficial to ask the reporter for the
`/var/lib/dpkg/status` file, which includes the packages installed on the
If a dependency solver bug is reported, but can't easily be reproduced by the
triager, it is beneficial to ask the reporter for the
`/var/lib/dpkg/status` file which includes the packages installed on the
system and in which version. Such a file can then be used via the option
`dir::state::status`. Beware of different architecture settings!
Bugreports usually include this information in the template. Assuming you
Bug reports usually include this information in the template. Assuming you
already have the `Packages` files for the architecture (see `sources.list`
manpage for the `arch=` option) you can change to a different architecture
with a config file like:
with a configuration file like:
APT::Architecture "arch1";
#clear APT::Architectures;
......@@ -173,8 +173,8 @@ APT works in its internal resolver in two stages: First all packages are visited
and marked for installation, keep back or removal. Option `Debug::pkgDepCache::Marker`
shows this. This also decides which packages are to be installed to satisfy dependencies,
which can be seen by `Debug::pkgDepCache::AutoInstall`. After this is done, we might
be in a situation in which two packages want to be installed, but only on of them can be.
It is the job of the pkgProblemResolver to decide which of two packages 'wins' and can
be in a situation in which two packages want to be installed, but only one of them can be.
It is the job of the `pkgProblemResolver` to decide which of two packages 'wins' and can
therefore decide what has to happen. You can see the contenders as well as their fight and
the resulting resolution with `Debug::pkgProblemResolver`.
......@@ -184,13 +184,13 @@ Various binaries (called 'methods') are tasked with downloading files. The Acqui
talks to them via simple text protocol. Depending on which side you want to see, either
`Debug::pkgAcquire::Worker` or `Debug::Acquire::http` (or similar) will show the messages.
The integration tests use a simple self-built webserver which also logs. If you find that
the http(s) methods do not behave like they should be try to implement this behavior in the
The integration tests use a simple self-built web server (`webserver`) which also logs. If you find that
the http(s) methods do not behave like they should be try to implement this behavior in
webserver for simpler and more controlled testing.
### Installation order
Dependencies are solved, packages downloaded: Everything read for the installation!
Dependencies are solved, packages downloaded: Everything is ready for the installation!
The last step in the chain is often forgotten, but still very important:
Packages have to be installed in a particular order so that their dependencies are
satisfied, but at the same time you don't want to install very important and optional
......@@ -207,9 +207,9 @@ Additional documentation
Many more things could and should be said about APT and its usage but are more
targeted at developers of related programs or only of special interest.
* [Protocol specification of APTs communication with external dependency solvers (EDSP)](./doc/external-dependency-solver-protocol.md)
* [Protocol specification of APTs communication with external installation planners (EIPP)](./doc/external-installation-planner-protocol.md)
* [Howto use and configure APT to acquire additional files in 'update' operations](./doc/acquire-additional-files.md)
* [Protocol specification of APT's communication with external dependency solvers (EDSP)](./doc/external-dependency-solver-protocol.md)
* [Protocol specification of APT's communication with external installation planners (EIPP)](./doc/external-installation-planner-protocol.md)
* [How to use and configure APT to acquire additional files in 'update' operations](./doc/acquire-additional-files.md)
* [Download and package installation progress reporting details](./doc/progress-reporting.md)
* [Remarks on DNS SRV record support in APT](./doc/srv-records-support.md)
* [Protocol specification of APT interfacing with external hooks via JSON](./doc/json-hooks-protocol.md)
......@@ -8,5 +8,4 @@
<libs>
@build_path@/apt-pkg/
@build_path@/apt-inst/
</libs>
# Include apt-pkg directly, as some files have #include <system.h>
include_directories(${PROJECT_BINARY_DIR}/include/apt-pkg)
# Set the version of the library
set(MAJOR 2.0)
set(MINOR 0)
set(APT_INST_MAJOR ${MAJOR} PARENT_SCOPE)
# Definition of the C++ files used to build the library - note that this
# is expanded at CMake time, so you have to rerun cmake if you add or remove
# a file (you can just run cmake . in the build directory)
file(GLOB_RECURSE library "*.cc")
file(GLOB_RECURSE headers "*.h")
# Create a library using the C++ files
add_library(apt-inst SHARED ${library})
# Link the library and set the SONAME
target_link_libraries(apt-inst PUBLIC apt-pkg ${CMAKE_THREAD_LIBS_INIT})
target_link_libraries(apt-inst PRIVATE ${CMAKE_THREAD_LIBS_INIT})
set_target_properties(apt-inst PROPERTIES VERSION ${MAJOR}.${MINOR})
set_target_properties(apt-inst PROPERTIES SOVERSION ${MAJOR})
add_version_script(apt-inst)
# Install the library and the headers
install(TARGETS apt-inst LIBRARY DESTINATION ${CMAKE_INSTALL_LIBDIR})
install(FILES ${headers} DESTINATION ${CMAKE_INSTALL_INCLUDEDIR}/apt-pkg)
flatify(${PROJECT_BINARY_DIR}/include/apt-pkg/ "${headers}")
- Replacing directories with files
dpkg permits this with the weak condition that the directory is owned only
by the package. APT requires that the directory have no files that are not
owned by the package. Replaces are specifically not checked to prevent
file list corruption.
// -*- mode: cpp; mode: fold -*-
// Description /*{{{*/
/* ######################################################################
Archive Extraction Directory Stream
Extraction for each file is a bit of an involved process. Each object
undergoes an atomic backup, overwrite, erase sequence. First the
object is unpacked to '.dpkg.new' then the original is hardlinked to
'.dpkg.tmp' and finally the new object is renamed to overwrite the old
one. From an external perspective the file never ceased to exist.
After the archive has been successfully unpacked the .dpkg.tmp files
are erased. A failure causes all the .dpkg.tmp files to be restored.
Decisions about unpacking go like this:
- Store the original filename in the file listing
- Resolve any diversions that would effect this file, all checks
below apply to the diverted name, not the real one.
- Resolve any symlinked configuration files.
- If the existing file does not exist then .dpkg-tmp is checked for.
[Note, this is reduced to only check if a file was expected to be
there]
- If the existing link/file is not a directory then it is replaced
regardless
- If the existing link/directory is being replaced by a directory then
absolutely nothing happens.
- If the existing link/directory is being replaced by a link then
absolutely nothing happens.
- If the existing link/directory is being replaced by a non-directory
then this will abort if the package is not the sole owner of the
directory. [Note, this is changed to not happen if the directory
non-empty - that is, it only includes files that are part of this
package - prevents removing user files accidentally.]
- If the non-directory exists in the listing database and it
does not belong to the current package then an overwrite condition
is invoked.
As we unpack we record the file list differences in the FL cache. If
we need to unroll the FL cache knows which files have been unpacked
and can undo. When we need to erase then it knows which files have not
been unpacked.
##################################################################### */
/*}}}*/
// Include Files /*{{{*/
#include <config.h>
#include <apt-pkg/debversion.h>
#include <apt-pkg/dirstream.h>
#include <apt-pkg/error.h>
#include <apt-pkg/extract.h>
#include <apt-pkg/filelist.h>
#include <apt-pkg/fileutl.h>
#include <apt-pkg/mmap.h>
#include <apt-pkg/pkgcache.h>
#include <iostream>
#include <string>
#include <dirent.h>
#include <errno.h>
#include <stdio.h>
#include <string.h>
#include <sys/stat.h>
#include <apti18n.h>
/*}}}*/
using namespace std;
static const char *TempExt = "dpkg-tmp";
//static const char *NewExt = "dpkg-new";
// Extract::pkgExtract - Constructor /*{{{*/
// ---------------------------------------------------------------------
/* */
pkgExtract::pkgExtract(pkgFLCache &FLCache,pkgCache::VerIterator Ver) :
FLCache(FLCache), Ver(Ver)
{
FLPkg = FLCache.GetPkg(Ver.ParentPkg().Name(),true);
if (FLPkg.end() == true)
return;
Debug = true;
}
/*}}}*/
// Extract::DoItem - Handle a single item from the stream /*{{{*/
// ---------------------------------------------------------------------
/* This performs the setup for the extraction.. */
bool pkgExtract::DoItem(Item &Itm, int &/*Fd*/)
{
/* Strip any leading/trailing /s from the filename, then copy it to the
temp buffer and re-apply the leading / We use a class variable
to store the new filename for use by the three extraction funcs */
char *End = FileName+1;
const char *I = Itm.Name;
for (; *I != 0 && *I == '/'; I++);
*FileName = '/';
for (; *I != 0 && End < FileName + sizeof(FileName); I++, End++)
*End = *I;
if (End + 20 >= FileName + sizeof(FileName))
return _error->Error(_("The path %s is too long"),Itm.Name);
for (; End > FileName && End[-1] == '/'; End--);
*End = 0;
Itm.Name = FileName;
/* Lookup the file. Nde is the file [group] we are going to write to and
RealNde is the actual node we are manipulating. Due to diversions
they may be entirely different. */
pkgFLCache::NodeIterator Nde = FLCache.GetNode(Itm.Name,End,0,false,false);
pkgFLCache::NodeIterator RealNde = Nde;
// See if the file is already in the file listing
unsigned long FileGroup = RealNde->File;
for (; RealNde.end() == false && FileGroup == RealNde->File; RealNde++)
if (RealNde.RealPackage() == FLPkg)
break;
// Nope, create an entry
if (RealNde.end() == true)
{
RealNde = FLCache.GetNode(Itm.Name,End,FLPkg.Offset(),true,false);
if (RealNde.end() == true)
return false;
RealNde->Flags |= pkgFLCache::Node::NewFile;
}
/* Check if this entry already was unpacked. The only time this should
ever happen is if someone has hacked tar to support capabilities, in
which case this needs to be modified anyhow.. */
if ((RealNde->Flags & pkgFLCache::Node::Unpacked) ==
pkgFLCache::Node::Unpacked)
return _error->Error(_("Unpacking %s more than once"),Itm.Name);
if (Nde.end() == true)
Nde = RealNde;
/* Consider a diverted file - We are not permitted to divert directories,
but everything else is fair game (including conf files!) */
if ((Nde->Flags & pkgFLCache::Node::Diversion) != 0)
{
if (Itm.Type == Item::Directory)
return _error->Error(_("The directory %s is diverted"),Itm.Name);
/* A package overwriting a diversion target is just the same as
overwriting a normally owned file and is checked for below in
the overwrites mechanism */
/* If this package is trying to overwrite the target of a diversion,
that is never, ever permitted */
pkgFLCache::DiverIterator Div = Nde.Diversion();
if (Div.DivertTo() == Nde)
return _error->Error(_("The package is trying to write to the "
"diversion target %s/%s"),Nde.DirN(),Nde.File());
// See if it is us and we are following it in the right direction
if (Div->OwnerPkg != FLPkg.Offset() && Div.DivertFrom() == Nde)
{
Nde = Div.DivertTo();
End = FileName + snprintf(FileName,sizeof(FileName)-20,"%s/%s",
Nde.DirN(),Nde.File());
if (End <= FileName)
return _error->Error(_("The diversion path is too long"));
}
}
// Deal with symlinks and conf files
if ((RealNde->Flags & pkgFLCache::Node::NewConfFile) ==
pkgFLCache::Node::NewConfFile)
{
string Res = flNoLink(Itm.Name);
if (Res.length() > sizeof(FileName))
return _error->Error(_("The path %s is too long"),Res.c_str());
if (Debug == true)
clog << "Followed conf file from " << FileName << " to " << Res << endl;
Itm.Name = strcpy(FileName,Res.c_str());
}
/* Get information about the existing file, and attempt to restore
a backup if it does not exist */
struct stat LExisting;
bool EValid = false;
if (lstat(Itm.Name,&LExisting) != 0)
{
// This is bad news.
if (errno != ENOENT)
return _error->Errno("stat",_("Failed to stat %s"),Itm.Name);
// See if we can recover the backup file
if (Nde.end() == false)
{
char Temp[sizeof(FileName)];
snprintf(Temp,sizeof(Temp),"%s.%s",Itm.Name,TempExt);
if (rename(Temp,Itm.Name) != 0 && errno != ENOENT)
return _error->Errno("rename",_("Failed to rename %s to %s"),
Temp,Itm.Name);
if (stat(Itm.Name,&LExisting) != 0)
{
if (errno != ENOENT)
return _error->Errno("stat",_("Failed to stat %s"),Itm.Name);
}
else
EValid = true;
}
}
else
EValid = true;
/* If the file is a link we need to stat its destination, get the
existing file modes */
struct stat Existing = LExisting;
if (EValid == true && S_ISLNK(Existing.st_mode))
{
if (stat(Itm.Name,&Existing) != 0)
{
if (errno != ENOENT)
return _error->Errno("stat",_("Failed to stat %s"),Itm.Name);
Existing = LExisting;
}
}
// We pretend a non-existing file looks like it is a normal file
if (EValid == false)
Existing.st_mode = S_IFREG;
/* Okay, at this point 'Existing' is the stat information for the
real non-link file */
/* The only way this can be a no-op is if a directory is being
replaced by a directory or by a link */
if (S_ISDIR(Existing.st_mode) != 0 &&
(Itm.Type == Item::Directory || Itm.Type == Item::SymbolicLink))
return true;
/* Non-directory being replaced by a non-directory. We check for
overwrites here. */
if (Nde.end() == false)
{
if (HandleOverwrites(Nde) == false)
return false;
}
/* Directory being replaced by a non-directory - this needs to see if
the package is the owner and then see if the directory would be
empty after the package is removed [ie no user files will be
erased] */
if (S_ISDIR(Existing.st_mode) != 0)
{
if (CheckDirReplace(Itm.Name) == false)
return _error->Error(_("The directory %s is being replaced by a non-directory"),Itm.Name);
}
if (Debug == true)
clog << "Extract " << string(Itm.Name,End) << endl;
/* if (Count != 0)
return _error->Error(_("Done"));*/
return true;
}
/*}}}*/
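// Illustrative sketch: diversion cases /*{{{*/
// ---------------------------------------------------------------------
/* An illustrative sketch, not from the original code and kept out of the
   build with #if 0: the diversion cases DoItem distinguishes above, written
   out as a small classifier. The enum and helper names are invented for the
   example. */
#if 0
enum class DiversionCase { DirectoryDiverted, WritesToTarget,
                           FollowDiversion, Unaffected };
static DiversionCase ClassifyDiversion(bool IsDirectory, bool NodeIsTarget,
                                       bool WeOwnDiversion, bool NodeIsSource)
{
   // Directories can never be diverted.
   if (IsDirectory)
      return DiversionCase::DirectoryDiverted;
   // Writing to the target of a diversion is never permitted.
   if (NodeIsTarget)
      return DiversionCase::WritesToTarget;
   // If we do not own the diversion and hit its source, the file is
   // written to the divert-to path instead.
   if (WeOwnDiversion == false && NodeIsSource)
      return DiversionCase::FollowDiversion;
   return DiversionCase::Unaffected;
}
#endif
/*}}}*/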
// Extract::Finished - Sequence finished, erase the temp files /*{{{*/
// ---------------------------------------------------------------------
/* */
APT_PURE bool pkgExtract::Finished()
{
return true;
}
/*}}}*/
// Extract::Aborted - Sequence aborted, undo all our unpacking /*{{{*/
// ---------------------------------------------------------------------
/* This undoes everything that was done by all calls to the DoItem method
and restores the File Listing cache to its original form. It bases its
actions on the flags value for each node in the cache. */
bool pkgExtract::Aborted()
{
if (Debug == true)
clog << "Aborted, backing out" << endl;
pkgFLCache::NodeIterator Files = FLPkg.Files();
map_ptrloc *Last = &FLPkg->Files;
/* Loop over all files, restore those that have been unpacked from their
dpkg-tmp entries */
while (Files.end() == false)
{
// Locate the hash bucket for the node and locate its group head
pkgFLCache::NodeIterator Nde(FLCache,FLCache.HashNode(Files));
for (; Nde.end() == false && Files->File != Nde->File; Nde++);
if (Nde.end() == true)
return _error->Error(_("Failed to locate node in its hash bucket"));
if (snprintf(FileName,sizeof(FileName)-20,"%s/%s",
Nde.DirN(),Nde.File()) <= 0)
return _error->Error(_("The path is too long"));
// Deal with diversions
if ((Nde->Flags & pkgFLCache::Node::Diversion) != 0)
{
pkgFLCache::DiverIterator Div = Nde.Diversion();
// See if it is us and we are following it in the right direction
if (Div->OwnerPkg != FLPkg.Offset() && Div.DivertFrom() == Nde)
{
Nde = Div.DivertTo();
if (snprintf(FileName,sizeof(FileName)-20,"%s/%s",
Nde.DirN(),Nde.File()) <= 0)
return _error->Error(_("The diversion path is too long"));
}
}
// Deal with overwrites+replaces
for (; Nde.end() == false && Files->File == Nde->File; Nde++)
{
if ((Nde->Flags & pkgFLCache::Node::Replaced) ==
pkgFLCache::Node::Replaced)
{
if (Debug == true)
clog << "De-replaced " << FileName << " from " << Nde.RealPackage()->Name << endl;
Nde->Flags &= ~pkgFLCache::Node::Replaced;
}
}
// Undo the change in the filesystem
if (Debug == true)
clog << "Backing out " << FileName;
// Remove a new node
if ((Files->Flags & pkgFLCache::Node::NewFile) ==
pkgFLCache::Node::NewFile)
{
if (Debug == true)
clog << " [new node]" << endl;
pkgFLCache::Node *Tmp = Files;
Files++;
*Last = Tmp->NextPkg;
Tmp->NextPkg = 0;
FLCache.DropNode(Tmp - FLCache.NodeP);
}
else
{
if (Debug == true)
clog << endl;
Last = &Files->NextPkg;
Files++;
}
}
return true;
}
/*}}}*/
// Extract::Fail - Extraction of a file Failed /*{{{*/
// ---------------------------------------------------------------------
/* */
bool pkgExtract::Fail(Item &Itm,int Fd)
{
return pkgDirStream::Fail(Itm,Fd);
}
/*}}}*/
// Extract::FinishedFile - Finished a file /*{{{*/
// ---------------------------------------------------------------------
/* */
bool pkgExtract::FinishedFile(Item &Itm,int Fd)
{
return pkgDirStream::FinishedFile(Itm,Fd);
}
/*}}}*/
// Extract::HandleOverwrites - See if a replaces covers this overwrite /*{{{*/
// ---------------------------------------------------------------------
/* Check if the file is in a package that is being replaced by this
package or if the file is being overwritten. Note that if the file
is really a directory but it has been erased from the filesystem
this will fail with an overwrite message. This is a limitation of the
dpkg file information format.
XX If a new package installs and another package replaces files in this
package what should we do? */
bool pkgExtract::HandleOverwrites(pkgFLCache::NodeIterator Nde,
bool DiverCheck)
{
pkgFLCache::NodeIterator TmpNde = Nde;
unsigned long DiverOwner = 0;
unsigned long FileGroup = Nde->File;
for (; Nde.end() == false && FileGroup == Nde->File; Nde++)
{
if ((Nde->Flags & pkgFLCache::Node::Diversion) != 0)
{
/* Store the diversion owner if this is the forward direction
of the diversion */
if (DiverCheck == true)
DiverOwner = Nde.Diversion()->OwnerPkg;
continue;
}
pkgFLCache::PkgIterator FPkg(FLCache,Nde.RealPackage());
if (FPkg.end() == true || FPkg == FLPkg)
continue;
/* This test trips when we are checking a diversion to see
if something has already been diverted by this diversion */
if (FPkg.Offset() == DiverOwner)
continue;
// Now see if this package matches one in a replace depends
pkgCache::DepIterator Dep = Ver.DependsList();
bool Ok = false;
for (; Dep.end() == false; ++Dep)
{
if (Dep->Type != pkgCache::Dep::Replaces)
continue;
// Does the replaces apply to this package?
if (strcmp(Dep.TargetPkg().Name(),FPkg.Name()) != 0)
continue;
/* Check the version for match. I do not think CurrentVer can be
0 if we are here.. */
pkgCache::PkgIterator Pkg = Dep.TargetPkg();
if (Pkg->CurrentVer == 0)
{
_error->Warning(_("Overwrite package match with no version for %s"),Pkg.Name());
continue;
}
// Replaces is met
if (debVS.CheckDep(Pkg.CurrentVer().VerStr(),Dep->CompareOp,Dep.TargetVer()) == true)
{
if (Debug == true)
clog << "Replaced file " << Nde.DirN() << '/' << Nde.File() << " from " << Pkg.Name() << endl;
Nde->Flags |= pkgFLCache::Node::Replaced;
Ok = true;
break;
}
}
// Negative Hit
if (Ok == false)
return _error->Error(_("File %s/%s overwrites the one in the package %s"),
Nde.DirN(),Nde.File(),FPkg.Name());
}
/* If this is a diversion we might have to recurse to process
the other side of it */
if ((TmpNde->Flags & pkgFLCache::Node::Diversion) != 0)
{
pkgFLCache::DiverIterator Div = TmpNde.Diversion();
if (Div.DivertTo() == TmpNde)
return HandleOverwrites(Div.DivertFrom(),true);
}
return true;
}
/*}}}*/
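// Illustrative sketch: versioned Replaces check /*{{{*/
// ---------------------------------------------------------------------
/* An illustrative sketch, not from the original code and kept out of the
   build with #if 0: how a versioned Replaces entry legitimises an overwrite,
   in the spirit of the loop in HandleOverwrites above. The package name and
   version strings are invented for the example. */
#if 0
static bool ExampleReplacesAllowsOverwrite()
{
   /* Suppose the unpacking package declares "Replaces: oldpkg (<< 2.0)"
      and the installed version of oldpkg is 1.5-1. Overwriting a file
      owned by oldpkg is then permitted because the Replaces is satisfied. */
   return debVS.CheckDep("1.5-1", pkgCache::Dep::Less, "2.0");
}
#endif
/*}}}*/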
// Extract::CheckDirReplace - See if this directory can be erased /*{{{*/
// ---------------------------------------------------------------------
/* If this directory is owned by a single package and that package is
replacing it with something non-directoryish then dpkg allows this.
We increase the requirement to be that the directory must be empty after
this package's own files are removed [ie no user files will be erased] */
bool pkgExtract::CheckDirReplace(string Dir,unsigned int Depth)
{
// Looping?
if (Depth > 40)
return false;
if (Dir[Dir.size() - 1] != '/')
Dir += '/';
DIR *D = opendir(Dir.c_str());
if (D == 0)
return _error->Errno("opendir",_("Unable to read %s"),Dir.c_str());
string File;
for (struct dirent *Dent = readdir(D); Dent != 0; Dent = readdir(D))
{
// Skip some files
if (strcmp(Dent->d_name,".") == 0 ||
strcmp(Dent->d_name,"..") == 0)
continue;
// Look up the node
File = Dir + Dent->d_name;
pkgFLCache::NodeIterator Nde = FLCache.GetNode(File.c_str(),
File.c_str() + File.length(),0,false,false);
// The file is not owned by this package
if (Nde.end() != false || Nde.RealPackage() != FLPkg)
{
closedir(D);
return false;
}
// See if it is a directory
struct stat St;
if (lstat(File.c_str(),&St) != 0)
{
closedir(D);
return _error->Errno("lstat",_("Unable to stat %s"),File.c_str());
}
// Recurse down directories
if (S_ISDIR(St.st_mode) != 0)
{
if (CheckDirReplace(File,Depth + 1) == false)
{
closedir(D);
return false;
}
}
}
// No conflicts
closedir(D);
return true;
}
/*}}}*/
// -*- mode: cpp; mode: fold -*-
// Description /*{{{*/
/* ######################################################################
Archive Extraction Directory Stream
This Directory Stream implements extraction of an archive into the
filesystem. It makes the choices on which files should be unpacked or
replaced, as well as guiding the actual unpacking.
When the unpacking sequence is completed, one of the two functions,
Finished or Aborted, must be called.
##################################################################### */
/*}}}*/
#ifndef PKGLIB_EXTRACT_H
#define PKGLIB_EXTRACT_H
#include <apt-pkg/dirstream.h>
#include <apt-pkg/filelist.h>
#include <apt-pkg/pkgcache.h>
#include <string>
class pkgExtract : public pkgDirStream
{
pkgFLCache &FLCache;
pkgCache::VerIterator Ver;
pkgFLCache::PkgIterator FLPkg;
char FileName[1024];
bool Debug;
bool HandleOverwrites(pkgFLCache::NodeIterator Nde,
bool DiverCheck = false);
bool CheckDirReplace(std::string Dir,unsigned int Depth = 0);
public:
virtual bool DoItem(Item &Itm,int &Fd) APT_OVERRIDE;
virtual bool Fail(Item &Itm,int Fd) APT_OVERRIDE;
virtual bool FinishedFile(Item &Itm,int Fd) APT_OVERRIDE;
bool Finished();
bool Aborted();
pkgExtract(pkgFLCache &FLCache,pkgCache::VerIterator Ver);
};
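// Illustrative usage sketch /*{{{*/
// ---------------------------------------------------------------------
/* An illustrative sketch, not part of the original header and kept out of
   the build with #if 0: the calling convention the description above
   implies. The archive iteration is abstracted behind a hypothetical
   ReadNextItem() helper; only the pkgExtract calls mirror the interface
   declared above. */
#if 0
static bool ReadNextItem(pkgDirStream::Item &Itm);   // hypothetical helper

static bool ExtractArchive(pkgFLCache &FLCache, pkgCache::VerIterator Ver)
{
   pkgExtract Extract(FLCache, Ver);
   pkgDirStream::Item Itm;
   int Fd = -1;
   while (ReadNextItem(Itm))
   {
      /* Decide what to do with the member; a real caller also writes the
         member's data to Fd between these two calls and reports write
         failures via Fail(). */
      if (Extract.DoItem(Itm, Fd) == false ||
          Extract.FinishedFile(Itm, Fd) == false)
      {
         Extract.Aborted();              // roll back the FL cache changes
         return false;
      }
   }
   return Extract.Finished();            // sequence completed normally
}
#endif
/*}}}*/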
#endif