Project: test_Tails_ISO_feature-trixie
Number: 27
Date: 11 Jun 2025, 17:54

Failures Overview

The following summary lists the scenarios that failed.

Scenario: Erasure of memory freed by killed userspace processes (duration: 14.330 s)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.002
Steps
Given I have started Tails from DVD without network and logged in 8.257
And I prepare Tails for memory erasure tests 0.959
When I start a process allocating 128 MiB of memory with a known pattern 1.414
Then patterns cover at least 128 MiB in the guest's memory 3.699
99.610% of the expected size (128 MiB) has the pattern, but more than 100.000% was expected.
<false> is not true. (Test::Unit::AssertionFailedError)
./features/step_definitions/erase_memory.rb:186:in `/^patterns cover at least (\d+) MiB in the guest's memory$/'
features/erase_memory.feature:17:in `Then patterns cover at least 128 MiB in the guest's memory'
Output 1

Pattern coverage: 99.610% (127 MiB out of 128 MiB reference memory)

When I kill the allocating process 0.000
Then I find very few patterns in the guest's memory after 5 seconds 0.000
After features/support/hooks.rb:330 5.436

SCENARIO FAILED: 'Erasure of memory freed by killed userspace processes' (at time 00:38:39)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/00:38:39_Erasure_of_memory_freed_by_killed_userspace_processes.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/00:38:39_Erasure_of_memory_freed_by_killed_userspace_processes.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/00:38:39_Erasure_of_memory_freed_by_killed_userspace_processes.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/00:38:39_Erasure_of_memory_freed_by_killed_userspace_processes.journal

After features/support/hooks.rb:100 0.000
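For context, the assertion above boils down to simple arithmetic: the step scans the guest's memory for the known pattern and requires the pattern to cover the full 128 MiB allocation. A minimal sketch of that computation, with illustrative variable names rather than the ones used in erase_memory.rb:

    # Hedged sketch of the coverage arithmetic reported in the failure above.
    expected_mib = 128
    found_mib    = 127.5   # roughly what was detected (about 99.6% of 128 MiB)

    coverage = found_mib / expected_mib * 100.0
    puts format('Pattern coverage: %.3f%% (%d MiB out of %d MiB reference memory)',
                coverage, found_mib.floor, expected_mib)

    # The step requires full coverage, so anything below 100.000% raises
    # Test::Unit::AssertionFailedError, as seen above.
    # assert(coverage >= 100.0)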
Scenario: I set up Additional Software when installing a package without persistent partition and the package is installed next time I start Tails (duration: 12:04.873)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.005
Before features/support/hooks.rb:513 0.005
Steps
Given I start Tails from a freshly installed USB drive with an administration password and the network is plugged and I login 1:51.694
And I update APT using apt 11.720
And I install "popularity-contest" using apt 26.367
Then I am proposed to add the "popularity-contest" package to my Additional Software 7.499
When I create a persistent storage and activate the Additional Software feature 21.358
Then Additional Software is correctly configured for package "popularity-contest" 2.809
When I shutdown Tails and wait for the computer to power off 16.291
And I start Tails from USB drive "__internal" with network unplugged 1:49.417
And I enable persistence 28.092
And I log in to a new session 27.678
And the Additional Software installation service has started 58.670
Then I am notified that the installation succeeded 5:03.275
try_for() timeout expired (Timeout::Error)
./features/support/helpers/misc_helpers.rb:145:in `rescue in try_for'
./features/support/helpers/misc_helpers.rb:51:in `try_for'
./features/step_definitions/common_steps.rb:595:in `/^I see the "(.+)" notification(?: after at most (\d+) seconds)?$/'
./features/step_definitions/additional_software_packages.rb:28:in `/^I am notified that the installation succeeded$/'
features/additional_software_packages.feature:39:in `Then I am notified that the installation succeeded'
And Additional Software is correctly configured for package "popularity-contest" 0.000
And the package "popularity-contest" is installed after Additional Software has been started 0.000
After features/support/hooks.rb:521 0.036
After features/support/hooks.rb:330 6.354

SCENARIO FAILED: 'I set up Additional Software when installing a package without persistent partition and the package is installed next time I start Tails' (at time 01:30:21)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/01:30:21_I_set_up_Additional_Software_when_installing_a_package_without_persistent_partition_and_the_package_is_installed_next_time_I_start_Tails.file_content_var_log_boot.log

Content of /run/live-additional-software/log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/01:30:21_I_set_up_Additional_Software_when_installing_a_package_without_persistent_partition_and_the_package_is_installed_next_time_I_start_Tails.file_content_run_live-additional-software_log

Output of ls -la --full-time /live/persistence/tailsdata_unlocked: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/01:30:21_I_set_up_Additional_Software_when_installing_a_package_without_persistent_partition_and_the_package_is_installed_next_time_I_start_Tails.cmd_output_persistent_volume

Output of ls -lar --full-time /var/cache/apt: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/01:30:21_I_set_up_Additional_Software_when_installing_a_package_without_persistent_partition_and_the_package_is_installed_next_time_I_start_Tails.cmd_output_var_cache_apt

Output of ls -lar --full-time /var/lib/apt: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/01:30:21_I_set_up_Additional_Software_when_installing_a_package_without_persistent_partition_and_the_package_is_installed_next_time_I_start_Tails.cmd_output_var_lib_apt

Output of mount: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/01:30:21_I_set_up_Additional_Software_when_installing_a_package_without_persistent_partition_and_the_package_is_installed_next_time_I_start_Tails.cmd_output_mount

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/01:30:21_I_set_up_Additional_Software_when_installing_a_package_without_persistent_partition_and_the_package_is_installed_next_time_I_start_Tails.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/01:30:21_I_set_up_Additional_Software_when_installing_a_package_without_persistent_partition_and_the_package_is_installed_next_time_I_start_Tails.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/01:30:21_I_set_up_Additional_Software_when_installing_a_package_without_persistent_partition_and_the_package_is_installed_next_time_I_start_Tails.journal

After features/support/hooks.rb:100 0.012
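The failure above is a plain polling timeout: try_for() in misc_helpers.rb keeps re-evaluating a condition (here, whether the "installation succeeded" notification is visible) until a deadline passes. A minimal sketch of that pattern, assuming a hypothetical notification_visible? helper and not reproducing the suite's actual implementation:

    require 'timeout'

    # Hedged sketch of a try_for-style polling helper, loosely modelled on
    # the behaviour implied by the trace above; not the real misc_helpers.rb.
    def try_for(timeout, delay: 1)
      Timeout.timeout(timeout) do
        loop do
          return true if yield
          sleep delay
        end
      end
    rescue Timeout::Error
      # Matches the "try_for() timeout expired" message in the failure.
      raise Timeout::Error, 'try_for() timeout expired'
    end

    # Usage: poll until the notification shows up, or give up after 300 s.
    # try_for(300) { notification_visible?('installation succeeded') }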
Scenario: The Additional Software dpkg hook notices when persistence is locked down while installing a package (duration: 25:14.158)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.005
Before features/support/hooks.rb:513 0.005
Steps
Given a computer 0.173
And I start Tails from USB drive "__internal" and I login with an administration password 2:56.425
And I update APT using apt 11.533
When I install "makepp" using apt 22:06.027
Tor operation failed (despite retrying 10 times) with
Timeout::Error: try_for() timeout expired
Last ignored exception was: ExecutionFailedInVM: Command failed: dpkg -s 'makepp' 2>/dev/null | grep -qs '^Status:.*installed$'
error code: 1
stdout: 
stderr: .
<false> is not true.
./features/support/helpers/vm_helper.rb:496:in `rescue in execute_successfully'
./features/support/helpers/vm_helper.rb:493:in `execute_successfully'
./features/step_definitions/apt.rb:69:in `block in wait_for_package_installation'
./features/support/helpers/misc_helpers.rb:78:in `block (2 levels) in try_for'
./features/support/helpers/misc_helpers.rb:66:in `loop'
./features/support/helpers/misc_helpers.rb:66:in `block in try_for'
./features/support/helpers/misc_helpers.rb:65:in `try_for'
./features/step_definitions/apt.rb:68:in `wait_for_package_installation'
./features/step_definitions/apt.rb:86:in `block (3 levels) in <top (required)>'
./features/step_definitions/apt.rb:82:in `block (2 levels) in <top (required)>'
./features/support/helpers/misc_helpers.rb:211:in `block in retry_action'
./features/support/helpers/misc_helpers.rb:208:in `loop'
./features/support/helpers/misc_helpers.rb:208:in `retry_action'
./features/support/helpers/misc_helpers.rb:196:in `retry_tor'
./features/step_definitions/apt.rb:81:in `block in <top (required)>' (MaxRetriesFailure)
./features/support/helpers/misc_helpers.rb:226:in `rescue in block in retry_action'
./features/support/helpers/misc_helpers.rb:209:in `block in retry_action'
./features/support/helpers/misc_helpers.rb:208:in `loop'
./features/support/helpers/misc_helpers.rb:208:in `retry_action'
./features/support/helpers/misc_helpers.rb:196:in `retry_tor'
./features/step_definitions/apt.rb:81:in `/^I install "(.+)" using apt$/'
features/additional_software_packages.feature:48:in `When I install "makepp" using apt'
Then the Additional Software dpkg hook has been run for package "makepp" and notices the persistence is locked 0.000
And the package "makepp" is installed 0.000
After features/support/hooks.rb:521 0.031
After features/support/hooks.rb:330 5.279

SCENARIO FAILED: 'The Additional Software dpkg hook notices when persistence is locked down while installing a package' (at time 01:55:41)

Exception thrown while trying to save /tmp/TailsToaster/artifact.file_content_run_live-additional-software_log: RemoteShell::ServerFailure: FileNotFoundError: [Errno 2] No such file or directory: '/run/live-additional-software/log'

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/01:55:41_The_Additional_Software_dpkg_hook_notices_when_persistence_is_locked_down_while_installing_a_package.file_content_var_log_boot.log

Output of ls -la --full-time /live/persistence/tailsdata_unlocked: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/01:55:41_The_Additional_Software_dpkg_hook_notices_when_persistence_is_locked_down_while_installing_a_package.cmd_output_persistent_volume

Output of ls -lar --full-time /var/cache/apt: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/01:55:41_The_Additional_Software_dpkg_hook_notices_when_persistence_is_locked_down_while_installing_a_package.cmd_output_var_cache_apt

Output of ls -lar --full-time /var/lib/apt: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/01:55:41_The_Additional_Software_dpkg_hook_notices_when_persistence_is_locked_down_while_installing_a_package.cmd_output_var_lib_apt

Output of mount: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/01:55:41_The_Additional_Software_dpkg_hook_notices_when_persistence_is_locked_down_while_installing_a_package.cmd_output_mount

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/01:55:41_The_Additional_Software_dpkg_hook_notices_when_persistence_is_locked_down_while_installing_a_package.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/01:55:41_The_Additional_Software_dpkg_hook_notices_when_persistence_is_locked_down_while_installing_a_package.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/01:55:41_The_Additional_Software_dpkg_hook_notices_when_persistence_is_locked_down_while_installing_a_package.journal

After features/support/hooks.rb:100 0.012
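The command quoted in the failure above is how the suite decides whether a package has finished installing: dpkg -s is polled until its Status line reports the package as installed. A hedged sketch of that check wrapped in Ruby, where the helper name and the use of system() are illustrative rather than the suite's real code:

    # Hedged sketch of the package-installation poll implied by the trace
    # above; the shell command is the one quoted verbatim in the failure.
    def package_installed?(package)
      system("dpkg -s '#{package}' 2>/dev/null | " \
             "grep -qs '^Status:.*installed$'")
    end

    # wait_for_package_installation keeps polling this (via try_for), and
    # retry_tor restarts the whole "apt install" attempt up to 10 times
    # before raising MaxRetriesFailure, as seen above.
    # try_for(20 * 60) { package_installed?('makepp') }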
Scenario: My Additional Software list is configurable through a GUI or through notifications when I install or remove packages with APT or Synaptic (duration: 5:49.308)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.005
Before features/support/hooks.rb:513 0.005
Steps
Given a computer 0.169
And I start Tails from USB drive "__internal" and I login with persistence enabled and an administration password 4:34.369
When I uninstall "popularity-contest" using apt 24.899
And I accept removing "popularity-contest" from Additional Software 1.730
Then "popularity-contest" is not in the list of Additional Software 0.734
When I start Synaptic 47.405
The Dogtail init script raised: SearchError: descendant of [application | gnome-shell]: child with name='Authentication Required' roleName='label'
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:339:in `initialize'
./features/support/helpers/dogtail.rb:268:in `new'
./features/support/helpers/dogtail.rb:268:in `call_tree_api_method'
./features/support/helpers/dogtail.rb:276:in `child'
./features/step_definitions/common_steps.rb:764:in `gnome_shell_unlock_dialog'
./features/step_definitions/common_steps.rb:782:in `deal_with_polkit_prompt'
./features/step_definitions/apt.rb:142:in `/^I start Synaptic$/'
features/additional_software_packages.feature:59:in `When I start Synaptic'
And I update APT using Synaptic 0.000
And I install "cowsay" using Synaptic 0.000
And I accept adding "cowsay" to Additional Software 0.000
Then Additional Software is correctly configured for package "cowsay" 0.000
When I uninstall "cowsay" using apt 0.000
And I refuse removing "cowsay" from Additional Software 0.000
Then Additional Software is correctly configured for package "cowsay" 0.000
When I start "Additional Software" via GNOME Activities Overview 0.000
And I remove "cowsay" from the list of Additional Software using Additional Software GUI 0.000
Then "cowsay" is not in the list of Additional Software 0.000
When I install "cowsay" using apt 0.000
And I refuse adding "cowsay" to Additional Software 0.000
Then "cowsay" is not in the list of Additional Software 0.000
After features/support/hooks.rb:521 0.029
After features/support/hooks.rb:330 7.628

SCENARIO FAILED: 'My Additional Software list is configurable through a GUI or through notifications when I install or remove packages with APT or Synaptic' (at time 02:01:36)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:01:36_My_Additional_Software_list_is_configurable_through_a_GUI_or_through_notifications_when_I_install_or_remove_packages_with_APT_or_Synaptic.file_content_var_log_boot.log

Content of /run/live-additional-software/log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:01:36_My_Additional_Software_list_is_configurable_through_a_GUI_or_through_notifications_when_I_install_or_remove_packages_with_APT_or_Synaptic.file_content_run_live-additional-software_log

Output of ls -la --full-time /live/persistence/tailsdata_unlocked: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:01:36_My_Additional_Software_list_is_configurable_through_a_GUI_or_through_notifications_when_I_install_or_remove_packages_with_APT_or_Synaptic.cmd_output_persistent_volume

Output of ls -lar --full-time /var/cache/apt: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:01:36_My_Additional_Software_list_is_configurable_through_a_GUI_or_through_notifications_when_I_install_or_remove_packages_with_APT_or_Synaptic.cmd_output_var_cache_apt

Output of ls -lar --full-time /var/lib/apt: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:01:36_My_Additional_Software_list_is_configurable_through_a_GUI_or_through_notifications_when_I_install_or_remove_packages_with_APT_or_Synaptic.cmd_output_var_lib_apt

Output of mount: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:01:36_My_Additional_Software_list_is_configurable_through_a_GUI_or_through_notifications_when_I_install_or_remove_packages_with_APT_or_Synaptic.cmd_output_mount

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:01:36_My_Additional_Software_list_is_configurable_through_a_GUI_or_through_notifications_when_I_install_or_remove_packages_with_APT_or_Synaptic.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:01:36_My_Additional_Software_list_is_configurable_through_a_GUI_or_through_notifications_when_I_install_or_remove_packages_with_APT_or_Synaptic.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:01:36_My_Additional_Software_list_is_configurable_through_a_GUI_or_through_notifications_when_I_install_or_remove_packages_with_APT_or_Synaptic.journal

After features/support/hooks.rb:100 0.007
Tags: @product
Scenario: Installing an old version of Tails to a pristine USB drive (duration: 2:29.555)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.005
Steps
Given a computer 0.181
And I create a 7200 MiB disk named "old" 0.042
And I plug USB drive "old" 1.048
And I write an old version of the Tails USB image to disk "old" 32.041
When I start Tails from USB drive "old" with network unplugged 1:56.242
The Dogtail init script raised: SearchError: descendent of [application | gnome-shell]: child with roleName='notification'
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:339:in `initialize'
./features/support/helpers/dogtail.rb:268:in `new'
./features/support/helpers/dogtail.rb:268:in `call_tree_api_method'
./features/support/helpers/dogtail.rb:276:in `child'
./features/step_definitions/common_steps.rb:453:in `/^the computer (?:re)?boots Tails$/'
./features/step_definitions/common_steps.rb:276:in `/^I start Tails from (.+?) drive "(.+?)"( with network unplugged)?( and I login( with persistence enabled)?( with the changed persistence passphrase)?( (?:and|with) an administration password)?)?$/'
features/usb_upgrade.feature:25:in `When I start Tails from USB drive "old" with network unplugged'
Then the boot device has safe access rights 0.000
And Tails is running from USB drive "old" 0.000
And there is no persistence partition on USB drive "old" 0.000
And process "udev-watchdog" is running 0.000
And udev-watchdog is monitoring the correct device 0.000
And I unplug USB drive "old" 0.000
After features/support/hooks.rb:330 6.363

SCENARIO FAILED: 'Installing an old version of Tails to a pristine USB drive' (at time 02:20:35)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:20:35_Installing_an_old_version_of_Tails_to_a_pristine_USB_drive.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:20:35_Installing_an_old_version_of_Tails_to_a_pristine_USB_drive.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:20:35_Installing_an_old_version_of_Tails_to_a_pristine_USB_drive.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:20:35_Installing_an_old_version_of_Tails_to_a_pristine_USB_drive.journal

After features/support/hooks.rb:100 0.010
Tags: @product
Scenario: Creating a persistent partition with the old Tails USB installation (duration: 1:23.869)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.006
Steps
Given a computer 0.178
And I start Tails from USB drive "old" with network unplugged and I login 1:23.690
The Dogtail init script raised: SearchError: descendent of [application | gnome-shell]: child with roleName='notification'
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:339:in `initialize'
./features/support/helpers/dogtail.rb:268:in `new'
./features/support/helpers/dogtail.rb:268:in `call_tree_api_method'
./features/support/helpers/dogtail.rb:276:in `child'
./features/step_definitions/common_steps.rb:453:in `/^the computer (?:re)?boots Tails$/'
./features/step_definitions/common_steps.rb:276:in `/^I start Tails from (.+?) drive "(.+?)"( with network unplugged)?( and I login( with persistence enabled)?( with the changed persistence passphrase)?( (?:and|with) an administration password)?)?$/'
features/usb_upgrade.feature:36:in `And I start Tails from USB drive "old" with network unplugged and I login'
Then Tails is running from USB drive "old" 0.000
And I create a persistent partition 0.000
And I take note of which tps features are available 0.000
Then a Tails persistence partition exists on USB drive "old" 0.000
And I shutdown Tails and wait for the computer to power off 0.000
After features/support/hooks.rb:330 5.789

SCENARIO FAILED: 'Creating a persistent partition with the old Tails USB installation' (at time 02:22:06)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:22:06_Creating_a_persistent_partition_with_the_old_Tails_USB_installation.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:22:06_Creating_a_persistent_partition_with_the_old_Tails_USB_installation.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:22:06_Creating_a_persistent_partition_with_the_old_Tails_USB_installation.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:22:06_Creating_a_persistent_partition_with_the_old_Tails_USB_installation.journal

After features/support/hooks.rb:100 0.011
Tags: @product
Scenario: Writing files to a read/write-enabled persistent partition with the old Tails USB installation (duration: 1:20.150)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.006
Steps
Given a computer 0.178
And I start Tails from USB drive "old" with network unplugged and I login with persistence enabled 1:19.972
The Dogtail init script raised: SearchError: descendent of [application | gnome-shell]: child with roleName='notification'
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:339:in `initialize'
./features/support/helpers/dogtail.rb:268:in `new'
./features/support/helpers/dogtail.rb:268:in `call_tree_api_method'
./features/support/helpers/dogtail.rb:276:in `child'
./features/step_definitions/common_steps.rb:453:in `/^the computer (?:re)?boots Tails$/'
./features/step_definitions/common_steps.rb:276:in `/^I start Tails from (.+?) drive "(.+?)"( with network unplugged)?( and I login( with persistence enabled)?( with the changed persistence passphrase)?( (?:and|with) an administration password)?)?$/'
features/usb_upgrade.feature:46:in `And I start Tails from USB drive "old" with network unplugged and I login with persistence enabled'
Then Tails is running from USB drive "old" 0.000
And all tps features are active 0.000
When I write some files expected to persist 0.000
# Verify that our baseline for the next scenarios is sane
And all persistent filesystems have safe access rights 0.000
And all persistence configuration files have safe access rights 0.000
And all persistent directories from the old Tails version have safe access rights 0.000
And I take note of which tps features are available 0.000
And I shutdown Tails and wait for the computer to power off 0.000
# XXX: how does guestfs work vs snapshots?
Then only the expected files are present on the persistence partition on USB drive "old" 0.000
After features/support/hooks.rb:330 5.696

SCENARIO FAILED: 'Writing files to a read/write-enabled persistent partition with the old Tails USB installation' (at time 02:23:32)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:23:32_Writing_files_to_a_read_write-enabled_persistent_partition_with_the_old_Tails_USB_installation.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:23:32_Writing_files_to_a_read_write-enabled_persistent_partition_with_the_old_Tails_USB_installation.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:23:32_Writing_files_to_a_read_write-enabled_persistent_partition_with_the_old_Tails_USB_installation.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:23:32_Writing_files_to_a_read_write-enabled_persistent_partition_with_the_old_Tails_USB_installation.journal

After features/support/hooks.rb:100 0.008
Tags: @product
Scenario: Booting Tails from a USB drive upgraded from USB with persistence enabled (duration: 2:05.116)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.004
Steps
Given a computer 0.172
And I start Tails from USB drive "to_upgrade" with network unplugged and I login with persistence enabled 2:04.944
The Dogtail init script raised: SearchError: descendant of [application | Welcome to Tails!]: child with roleName='password text'
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:339:in `initialize'
./features/support/helpers/dogtail.rb:268:in `new'
./features/support/helpers/dogtail.rb:268:in `call_tree_api_method'
./features/support/helpers/dogtail.rb:276:in `child'
./features/step_definitions/usb.rb:671:in `/^I try to enable persistence( with the changed passphrase)?$/'
./features/step_definitions/usb.rb:703:in `/^I enable persistence( with the changed passphrase)?$/'
./features/step_definitions/common_steps.rb:283:in `/^I start Tails from (.+?) drive "(.+?)"( with network unplugged)?( and I login( with persistence enabled)?( with the changed persistence passphrase)?( (?:and|with) an administration password)?)?$/'
features/usb_upgrade.feature:73:in `And I start Tails from USB drive "to_upgrade" with network unplugged and I login with persistence enabled'
Then all tps features from the old Tails version are active 0.000
And Tails is running from USB drive "to_upgrade" 0.000
And the boot device has safe access rights 0.000
And the expected persistent files created with the old Tails version are present in the filesystem 0.000
And all persistent directories from the old Tails version have safe access rights 0.000
After features/support/hooks.rb:330 5.754

SCENARIO FAILED: 'Booting Tails from a USB drive upgraded from USB with persistence enabled' (at time 02:28:38)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:28:38_Booting_Tails_from_a_USB_drive_upgraded_from_USB_with_persistence_enabled.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:28:38_Booting_Tails_from_a_USB_drive_upgraded_from_USB_with_persistence_enabled.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:28:38_Booting_Tails_from_a_USB_drive_upgraded_from_USB_with_persistence_enabled.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:28:38_Booting_Tails_from_a_USB_drive_upgraded_from_USB_with_persistence_enabled.journal

After features/support/hooks.rb:100 0.009
Scenario: Upgrading an initial Tails installation with an incremental upgrade (duration: 22:52.033)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.005
Steps
Given I have started Tails without network from a USB drive with a persistent partition enabled and logged in 13.004
And no SquashFS delta is installed 0.213
And Tails is fooled to think that version 6.0~testoverlayfs was initially installed 0.741
And Tails is fooled to think it is running version 6.0~testoverlayfs 0.067
And the file system changes introduced in version 6.2~testoverlayfs are not present 0.329
And the file system changes introduced in version 6.3~testoverlayfs are not present 0.526
When the network is plugged 0.078
And Tor is ready 22.780
Then I am proposed to install an incremental upgrade to version 6.2~testoverlayfs 22:14.290
Tor operation failed (despite retrying 10 times) with
FindFailed: can not find any of the patterns ["TailsUpgraderUpgradeTo6.2~testoverlayfs.png", "TailsUpgraderFailure.png"] on the screen (MaxRetriesFailure)
./features/support/helpers/misc_helpers.rb:226:in `rescue in block in retry_action'
./features/support/helpers/misc_helpers.rb:209:in `block in retry_action'
./features/support/helpers/misc_helpers.rb:208:in `loop'
./features/support/helpers/misc_helpers.rb:208:in `retry_action'
./features/support/helpers/misc_helpers.rb:196:in `retry_tor'
./features/step_definitions/usb.rb:1301:in `/^I am proposed to install an incremental upgrade to version (.+)$/'
features/usb_upgrade.feature:90:in `Then I am proposed to install an incremental upgrade to version 6.2~testoverlayfs'
And I can successfully install the incremental upgrade to version 6.2~testoverlayfs 0.000
Given I shutdown Tails and wait for the computer to power off 0.000
When I start Tails from USB drive "__internal" with network unplugged and I login with persistence enabled 0.000
Then Tails is running version 6.2~testoverlayfs 0.000
And all tps features are active 0.000
And the file system changes introduced in version 6.2~testoverlayfs are present 0.000
And only the 6.2~testoverlayfs SquashFS delta is installed 0.000
# Our IUK sets a release date that can make Tor bootstrapping impossible
Given Tails system time is magically synchronized 0.000
# We'll really install Tails_amd64_6.0~testoverlayfs_to_6.3~testoverlayfs.iuk
# but we need some way to force upgrading a second time in a row
# even if only the initially installed version is considered
And Tails is fooled to think that version 6.1~testoverlayfs was initially installed 0.000
When the network is plugged 0.000
And Tor is ready 0.000
Then I am proposed to install an incremental upgrade to version 6.3~testoverlayfs 0.000
And I can successfully install the incremental upgrade to version 6.3~testoverlayfs 0.000
Given I shutdown Tails and wait for the computer to power off 0.000
When I start Tails from USB drive "__internal" with network unplugged and I login with persistence enabled 0.000
Then Tails is running version 6.3~testoverlayfs 0.000
And all tps features are active 0.000
And the file system changes introduced in version 6.3~testoverlayfs are present 0.000
And only the 6.3~testoverlayfs SquashFS delta is installed 0.000
# Regression test for #17425 (i.e. the Upgrader would propose
# upgrading to the version that's already running)
Given Tails system time is magically synchronized 0.000
And Tails is fooled to think that version 6.1~testoverlayfs was initially installed 0.000
When the network is plugged 0.000
And Tor is ready 0.000
Then the Upgrader considers the system as up-to-date 0.000
# Regression test on #8158 (i.e. the IUK's filesystem is not part of the Unsafe Browser's chroot)
And I successfully start the Unsafe Browser 0.000
And the file system changes introduced in version 6.3~testoverlayfs are present in the Unsafe Browser's chroot 0.000
After features/support/hooks.rb:330 6.157

SCENARIO FAILED: 'Upgrading an initial Tails installation with an incremental upgrade' (at time 02:51:36)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:51:36_Upgrading_an_initial_Tails_installation_with_an_incremental_upgrade.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:51:36_Upgrading_an_initial_Tails_installation_with_an_incremental_upgrade.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:51:36_Upgrading_an_initial_Tails_installation_with_an_incremental_upgrade.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/02:51:36_Upgrading_an_initial_Tails_installation_with_an_incremental_upgrade.journal

After features/support/hooks.rb:100 0.000
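The MaxRetriesFailure above comes from the suite's generic retry wrapper: retry_tor re-runs the whole operation up to 10 times and only then gives up, reporting the last exception (here a FindFailed from the screen matcher). A rough sketch of that shape, assuming a hypothetical upgrade_prompt_visible? check and not claiming to mirror misc_helpers.rb exactly:

    # Hedged sketch of a retry_action-style wrapper consistent with the
    # "despite retrying 10 times" / MaxRetriesFailure messages above.
    class MaxRetriesFailure < StandardError; end

    def retry_action(max_retries, operation_name: 'Operation')
      attempts = 0
      begin
        attempts += 1
        yield
      rescue StandardError => e
        retry if attempts < max_retries
        raise MaxRetriesFailure,
              "#{operation_name} failed (despite retrying #{max_retries} " \
              "times) with\n#{e.class}: #{e.message}"
      end
    end

    # retry_tor in the traces above is essentially this with 10 attempts
    # plus Tor-specific recovery between them:
    # retry_action(10, operation_name: 'Tor operation') { upgrade_prompt_visible? }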
Scenario: Upgrading a Tails whose signing key is outdated (duration: 22:44.254)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.008
Steps
Given I have started Tails without network from a USB drive with a persistent partition enabled and logged in 12.177
And Tails is fooled to think that version 6.0~testoverlayfs was initially installed 0.408
And Tails is fooled to think it is running version 6.0~testoverlayfs 0.101
And the signing key used by the Upgrader is outdated 0.084
But a current signing key is available on our website 0.000
When the network is plugged 0.084
And Tor is ready 19.831
Then I am proposed to install an incremental upgrade to version 6.2~testoverlayfs 22:11.565
Tor operation failed (despite retrying 10 times) with
FindFailed: can not find any of the patterns ["TailsUpgraderUpgradeTo6.2~testoverlayfs.png", "TailsUpgraderFailure.png"] on the screen (MaxRetriesFailure)
./features/support/helpers/misc_helpers.rb:226:in `rescue in block in retry_action'
./features/support/helpers/misc_helpers.rb:209:in `block in retry_action'
./features/support/helpers/misc_helpers.rb:208:in `loop'
./features/support/helpers/misc_helpers.rb:208:in `retry_action'
./features/support/helpers/misc_helpers.rb:196:in `retry_tor'
./features/step_definitions/usb.rb:1301:in `/^I am proposed to install an incremental upgrade to version (.+)$/'
features/usb_upgrade.feature:134:in `Then I am proposed to install an incremental upgrade to version 6.2~testoverlayfs'
After features/support/hooks.rb:330 5.487

SCENARIO FAILED: 'Upgrading a Tails whose signing key is outdated' (at time 03:14:27)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:14:27_Upgrading_a_Tails_whose_signing_key_is_outdated.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:14:27_Upgrading_a_Tails_whose_signing_key_is_outdated.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:14:27_Upgrading_a_Tails_whose_signing_key_is_outdated.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:14:27_Upgrading_a_Tails_whose_signing_key_is_outdated.journal

After features/support/hooks.rb:100 0.000
Scenario: The Tails documentation launcher works when offline (duration: 45.957 s)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.003
Steps
Given I have started Tails from DVD without network and logged in 8.519
When I start "Tails Documentation" via GNOME Activities Overview 10.086
Then the Tor Browser has started 27.351
cannot find TorBrowserNewTabButton.png on the screen (FindFailed)
./features/support/helpers/screen.rb:181:in `rescue in wait'
./features/support/helpers/screen.rb:165:in `wait'
./features/step_definitions/browser.rb:479:in `/^the Tor Browser (?:has started|starts)$/'
features/documentation.feature:7:in `Then the Tor Browser has started'
Then "Tails - Documentation" has loaded in the Tor Browser 0.000
After features/support/hooks.rb:330 8.709

SCENARIO FAILED: 'The Tails documentation launcher works when offline' (at time 03:25:50)

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:25:50_The_Tails_documentation_launcher_works_when_offline.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:25:50_The_Tails_documentation_launcher_works_when_offline.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:25:50_The_Tails_documentation_launcher_works_when_offline.journal

After features/support/hooks.rb:100 0.000
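The FindFailed above is raised by the screen helper when a reference image (TorBrowserNewTabButton.png) never shows up on the VM display within the allotted time. A very rough sketch of that wait loop, where find_on_screen? stands in for the real image matching in screen.rb:

    # Hedged sketch of the image-wait behaviour implied by the trace above;
    # find_on_screen? is hypothetical, the real matching lives in screen.rb.
    class FindFailed < StandardError; end

    def wait_for_image(image, timeout)
      deadline = Time.now + timeout
      until Time.now > deadline
        return true if find_on_screen?(image)
        sleep 1
      end
      raise FindFailed, "cannot find #{image} on the screen"
    end

    # wait_for_image('TorBrowserNewTabButton.png', 30)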
Tags: @product
Scenario Outline: GPT backup corruption with a persistent partition, Examples (#1) (duration: 2:50.423)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.004
Steps
Given I have started Tails without network from a USB drive with a persistent partition and stopped at Tails Greeter's login screen 12.173
And I corrupt the boot device's GPT backup header 0.468
And I power off the computer 0.330
When I start the computer 1.035
Then the computer boots Tails 1:51.115
When I log in to a new session 32.489
And all notifications have disappeared 12.580
Then I am recommended to migrate to a new USB stick due to partitioning errors 0.230
<nil> was expected to not be nil. (Test::Unit::AssertionFailedError)
./features/step_definitions/hardware.rb:84:in `/^I am recommended to migrate to a new USB stick due to partitioning errors$/'
features/hardware_failure.feature:45:in `Then I am recommended to migrate to a new USB stick due to partitioning errors'
features/hardware_failure.feature:41:in `Then I am recommended to migrate to a new USB stick due to partitioning errors'
And Tails detected partitioning error partitioning-corruption 0.000
After features/support/hooks.rb:330 6.057

SCENARIO FAILED: 'GPT backup corruption with a persistent partition, Examples (#1)' (at time 03:41:39)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:41:39_GPT_backup_corruption_with_a_persistent_partition,_Examples___1_.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:41:39_GPT_backup_corruption_with_a_persistent_partition,_Examples___1_.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:41:39_GPT_backup_corruption_with_a_persistent_partition,_Examples___1_.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:41:39_GPT_backup_corruption_with_a_persistent_partition,_Examples___1_.journal

After features/support/hooks.rb:100 0.009
Tags: @product
Scenario Outline: GPT backup corruption with a persistent partition, Examples (#2) (duration: 2:46.375)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.005
Steps
Given I have started Tails without network from a USB drive with a persistent partition and stopped at Tails Greeter's login screen 10.546
And I corrupt the boot device's GPT backup partition table 0.369
And I power off the computer 0.330
When I start the computer 1.004
Then the computer boots Tails 1:48.576
When I log in to a new session 31.871
And all notifications have disappeared 13.381
Then I am recommended to migrate to a new USB stick due to partitioning errors 0.295
<nil> was expected to not be nil. (Test::Unit::AssertionFailedError)
./features/step_definitions/hardware.rb:84:in `/^I am recommended to migrate to a new USB stick due to partitioning errors$/'
features/hardware_failure.feature:46:in `Then I am recommended to migrate to a new USB stick due to partitioning errors'
features/hardware_failure.feature:41:in `Then I am recommended to migrate to a new USB stick due to partitioning errors'
And Tails detected partitioning error partitioning-corruption 0.000
After features/support/hooks.rb:330 6.298

SCENARIO FAILED: 'GPT backup corruption with a persistent partition, Examples (#2)' (at time 03:44:32)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:44:32_GPT_backup_corruption_with_a_persistent_partition,_Examples___2_.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:44:32_GPT_backup_corruption_with_a_persistent_partition,_Examples___2_.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:44:32_GPT_backup_corruption_with_a_persistent_partition,_Examples___2_.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:44:32_GPT_backup_corruption_with_a_persistent_partition,_Examples___2_.journal

After features/support/hooks.rb:100 0.008
Tags: @product
Scenario: GPT backup corruption without a persistent partition (duration: 3:42.444)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.003
Steps
Given a computer 0.166
And I set Tails to boot with options "test_gpt_corruption=gpt_backup,gpt_backup_table" 0.000
And I temporarily create a 7200 MiB disk named "temp" 0.043
And I plug USB drive "temp" 1.057
And I write the Tails USB image to disk "temp" 33.501
When I start Tails from USB drive "temp" with network unplugged 2:25.361
Then Tails is running from USB drive "temp" 0.379
And the Greeter forbids creating a persistent partition 0.086
When I log in to a new session 29.302
And all notifications have disappeared 12.294
Then I am recommended to reinstall Tails due to partitioning errors 0.252
<"OK"> was expected to include
<"Creation of Persistent Storage has been disabled">. (Test::Unit::AssertionFailedError)
./features/step_definitions/hardware.rb:94:in `/^I am recommended to reinstall Tails due to partitioning errors$/'
features/hardware_failure.feature:59:in `Then I am recommended to reinstall Tails due to partitioning errors'
# We are gonna verify the dialog again so we need to clean up the
# first instance.
And I close the "zenity" window 0.000
And I am told that Persistent Storage cannot be created 0.000
And Tails detected partitioning error partitioning-corruption 0.000
After features/support/hooks.rb:330 6.281

SCENARIO FAILED: 'GPT backup corruption without a persistent partition' (at time 03:48:20)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:48:20_GPT_backup_corruption_without_a_persistent_partition.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:48:20_GPT_backup_corruption_without_a_persistent_partition.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:48:20_GPT_backup_corruption_without_a_persistent_partition.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:48:20_GPT_backup_corruption_without_a_persistent_partition.journal

After features/support/hooks.rb:100 0.108
Tags: @product
Scenario: The included APT repository keys are up-to-date (duration: 8.770 s)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.004
Steps
Given I have started Tails from DVD without network and logged in 8.526
Then the keys trusted by APT are valid for the next 3 months 0.243
Command failed: apt-key adv --batch --with-colons --fingerprint --list-key
error code: 127
stdout: 
stderr: /bin/sh: 1: apt-key: not found
.
<false> is not true. (ExecutionFailedInVM)
./features/support/helpers/vm_helper.rb:496:in `rescue in execute_successfully'
./features/support/helpers/vm_helper.rb:493:in `execute_successfully'
./features/step_definitions/checks.rb:19:in `assert_all_keys_are_valid_for_n_months'
./features/step_definitions/checks.rb:8:in `/^the keys trusted by APT are valid for the next (\d+) months$/'
features/keys.feature:11:in `Then the keys trusted by APT are valid for the next 3 months'
After features/support/hooks.rb:330 5.374

SCENARIO FAILED: 'The included APT repository keys are up-to-date' (at time 03:58:47)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:58:47_The_included_APT_repository_keys_are_up-to-date.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:58:47_The_included_APT_repository_keys_are_up-to-date.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:58:47_The_included_APT_repository_keys_are_up-to-date.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/03:58:47_The_included_APT_repository_keys_are_up-to-date.journal

After features/support/hooks.rb:100 0.000
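The "apt-key: not found" error above is expected on trixie: apt-key was deprecated and has been removed from Debian 12 and later, so the step's key-expiry check has to read the keyrings under /etc/apt/trusted.gpg.d/ directly. A hedged sketch of one way to do that with gpg, offered as an illustration rather than the fix the suite actually adopted (field 7 of gpg's --with-colons "pub" records is the expiration date, epoch seconds with modern GnuPG):

    # Hedged sketch: list APT-trusted keys that expire within ~3 months,
    # without apt-key. Paths and the 90-day window mirror the step's intent.
    Dir.glob('/etc/apt/trusted.gpg.d/*.{gpg,asc}').each do |keyring|
      `gpg --show-keys --with-colons #{keyring} 2>/dev/null`.each_line do |line|
        fields = line.split(':')
        next unless fields[0] == 'pub'
        expiry = fields[6]                 # empty means the key never expires
        next if expiry.nil? || expiry.empty?
        expires_at = Time.at(expiry.to_i)
        if expires_at < Time.now + 90 * 24 * 3600
          puts "#{keyring}: key expires #{expires_at}"
        end
      end
    end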
Tags: @product @doc
Scenario: Manual filesystem repair fails (duration: 2:08.217)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.008
Steps
Given I have started Tails without network from a USB drive with a persistent partition and stopped at Tails Greeter's login screen 9.498
And the Persistent Storage filesystem is corrupted beyond what e2fsck can repair 0.145
When I try to enable persistence 0.349
Then the Welcome Screen tells me that filesystem errors were found on the Persistent Storage 34.680
When I repair the filesystem of the Persistent Storage 1.485
Then the Welcome Screen tells me that it failed to repair the Persistent Storage 0.066
When I log in to a new session 32.778
And all notifications have disappeared 32.514
# Now both WhisperBack and Tor Browser has started at the same
# time, so we do not know which window is focused. The steps about
# Tor Browser care about that, but the ones about WhisperBack do
# not, so we first deal with WhisperBack and then kill it to
# ensure that Tor Browser is focused.
Then WhisperBack is prefilled for fsck with summary: "Failed to repair the file system of your Persistent Storage" 1.623
And the file "/var/lib/gdm3/post-greeter-whisperback.json" is empty 2.803
When I close the "whisperback" window 1.917
Then the Tor Browser starts 10.354
cannot find TorBrowserNewTabButton.png on the screen (FindFailed)
./features/support/helpers/screen.rb:181:in `rescue in wait'
./features/support/helpers/screen.rb:165:in `wait'
./features/step_definitions/browser.rb:479:in `/^the Tor Browser (?:has started|starts)$/'
features/persistence.feature:201:in `Then the Tor Browser starts'
And "Tails - Recovering data from the Persistent Storage when it has file system errors" has loaded in the Tor Browser 0.000
After features/support/hooks.rb:330 6.362

SCENARIO FAILED: 'Manual filesystem repair fails' (at time 05:45:59)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/05:45:59_Manual_filesystem_repair_fails.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/05:45:59_Manual_filesystem_repair_fails.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/05:45:59_Manual_filesystem_repair_fails.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/05:45:59_Manual_filesystem_repair_fails.journal

After features/support/hooks.rb:100 0.000
Scenario: Chatting with some friend over XMPP (duration: 50.891 s)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.004
Before features/support/hooks.rb:513 0.003
Steps
Given I have started Tails from DVD and logged in and the network is connected 15.872
When I start "Pidgin Internet Messenger" via GNOME Activities Overview 10.837
Then I see Pidgin's account manager window 3.377
When I create my XMPP account 20.804
cannot find PidginAddAccountWindow.png on the screen (FindFailed)
./features/support/helpers/screen.rb:181:in `rescue in wait'
./features/support/helpers/screen.rb:165:in `wait'
./features/step_definitions/pidgin.rb:123:in `/^I create my XMPP account$/'
features/pidgin.feature:21:in `When I create my XMPP account'
And I close Pidgin's account manager window 0.000
Then Pidgin automatically enables my XMPP account 0.000
Given my XMPP friend goes online 0.000
When I start a conversation with my friend 0.000
And I say something to my friend 0.000
Then I receive a response from my friend 0.000
After features/support/hooks.rb:521 0.045
After features/support/hooks.rb:330 5.591

SCENARIO FAILED: 'Chatting with some friend over XMPP' (at time 05:59:37)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/05:59:37_Chatting_with_some_friend_over_XMPP.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/05:59:37_Chatting_with_some_friend_over_XMPP.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/05:59:37_Chatting_with_some_friend_over_XMPP.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/05:59:37_Chatting_with_some_friend_over_XMPP.journal

After features/support/hooks.rb:100 0.000
Scenario: Chatting with some friend over XMPP in a multi-user chat (duration: 48.702 s)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.004
Before features/support/hooks.rb:513 0.005
Steps
Given I have started Tails from DVD and logged in and the network is connected 13.279
When I start "Pidgin Internet Messenger" via GNOME Activities Overview 10.898
Then I see Pidgin's account manager window 3.709
When I create my XMPP account 20.814
cannot find PidginAddAccountWindow.png on the screen (FindFailed)
./features/support/helpers/screen.rb:181:in `rescue in wait'
./features/support/helpers/screen.rb:165:in `wait'
./features/step_definitions/pidgin.rb:123:in `/^I create my XMPP account$/'
features/pidgin.feature:34:in `When I create my XMPP account'
And I close Pidgin's account manager window 0.000
Then Pidgin automatically enables my XMPP account 0.000
When I join some empty multi-user chat 0.000
And I clear the multi-user chat's scrollback 0.000
And my XMPP friend goes online and joins the multi-user chat 0.000
Then I can see that my friend joined the multi-user chat 0.000
And I say something to my friend in the multi-user chat 0.000
Then I receive a response from my friend in the multi-user chat 0.000
When I say https://gitlab.tails.boum.org/tails to my friend in the multi-user chat 0.000
Then I see the Tails GitLab URL 0.000
When I wait 10 seconds 0.000
And I click on the Tails GitLab URL 0.000
Then the Tor Browser loads the Tails GitLab 0.000
After features/support/hooks.rb:521 0.034
After features/support/hooks.rb:330 5.710

SCENARIO FAILED: 'Chatting with some friend over XMPP in a multi-user chat' (at time 06:00:32)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:00:32_Chatting_with_some_friend_over_XMPP_in_a_multi-user_chat.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:00:32_Chatting_with_some_friend_over_XMPP_in_a_multi-user_chat.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:00:32_Chatting_with_some_friend_over_XMPP_in_a_multi-user_chat.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:00:32_Chatting_with_some_friend_over_XMPP_in_a_multi-user_chat.journal

After features/support/hooks.rb:100 0.000
Scenario: Connecting to the tails multi-user chat with my XMPP account (duration: 47.015 s)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.004
Before features/support/hooks.rb:513 0.004
Steps
Given I have started Tails from DVD and logged in and the network is connected 11.957
When I start "Pidgin Internet Messenger" via GNOME Activities Overview 10.815
Then I see Pidgin's account manager window 3.450
And I create my XMPP account 20.792
cannot find PidginAddAccountWindow.png on the screen (FindFailed)
./features/support/helpers/screen.rb:181:in `rescue in wait'
./features/support/helpers/screen.rb:165:in `wait'
./features/step_definitions/pidgin.rb:123:in `/^I create my XMPP account$/'
features/pidgin.feature:54:in `And I create my XMPP account'
And I close Pidgin's account manager window 0.000
Then Pidgin automatically enables my XMPP account 0.000
And I can join the "tails" channel on "chat.disroot.org" 0.000
After features/support/hooks.rb:521 0.036
After features/support/hooks.rb:330 5.554

SCENARIO FAILED: 'Connecting to the tails multi-user chat with my XMPP account' (at time 06:01:24)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:01:24_Connecting_to_the_tails_multi-user_chat_with_my_XMPP_account.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:01:24_Connecting_to_the_tails_multi-user_chat_with_my_XMPP_account.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:01:24_Connecting_to_the_tails_multi-user_chat_with_my_XMPP_account.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:01:24_Connecting_to_the_tails_multi-user_chat_with_my_XMPP_account.journal

After features/support/hooks.rb:100 0.000
Scenario: Using a persistent Pidgin configuration (duration: 1:20.026)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.004
Before features/support/hooks.rb:513 0.005
Steps
Given I have started Tails without network from a USB drive with a persistent partition enabled and logged in 12.880
And the network is plugged 0.186
And Tor is ready 19.152
And available upgrades have been checked 2.212
And all notifications have disappeared 7.556
When I start "Pidgin Internet Messenger" via GNOME Activities Overview 10.886
Then I see Pidgin's account manager window 6.189
When I create my XMPP account 20.963
cannot find PidginAddAccountWindow.png on the screen (FindFailed)
./features/support/helpers/screen.rb:181:in `rescue in wait'
./features/support/helpers/screen.rb:165:in `wait'
./features/step_definitions/pidgin.rb:123:in `/^I create my XMPP account$/'
features/pidgin.feature:68:in `When I create my XMPP account'
And I close Pidgin's account manager window 0.000
Then Pidgin automatically enables my XMPP account 0.000
When I close Pidgin 0.000
And I take note of the configured Pidgin accounts 0.000
And I shutdown Tails and wait for the computer to power off 0.000
Given a computer 0.000
And I start Tails from USB drive "__internal" and I login with persistence enabled 0.000
And Pidgin has the expected persistent accounts configured 0.000
When I start "Pidgin Internet Messenger" via GNOME Activities Overview 0.000
Then Pidgin automatically enables my XMPP account 0.000
And I join some empty multi-user chat 0.000
# Exercise Pidgin AppArmor profile with persistence enabled.
# This should really be in dedicated scenarios, but it would be
# too costly to set up the virtual USB drive with persistence more
# than once in this feature.
Given I start monitoring the AppArmor log of "/usr/bin/pidgin" 0.000
Then I cannot add a certificate from the "/home/amnesia/.gnupg" directory to Pidgin 0.000
And AppArmor has denied "/usr/bin/pidgin" from opening "/home/amnesia/.gnupg/test.crt" 0.000
When I close Pidgin's certificate import failure dialog 0.000
And I close Pidgin's certificate manager 0.000
Given I restart monitoring the AppArmor log of "/usr/bin/pidgin" 0.000
Then I cannot add a certificate from the "/live/persistence/TailsData_unlocked/gnupg" directory to Pidgin 0.000
And AppArmor has denied "/usr/bin/pidgin" from opening "/live/persistence/TailsData_unlocked/gnupg/test.crt" 0.000
When I close Pidgin's certificate import failure dialog 0.000
And I close Pidgin's certificate manager 0.000
Then I can add a certificate from the "/home/amnesia" directory to Pidgin 0.000
After features/support/hooks.rb:521 0.035
After features/support/hooks.rb:330 5.542

SCENARIO FAILED: 'Using a persistent Pidgin configuration' (at time 06:02:50)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:02:50_Using_a_persistent_Pidgin_configuration.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:02:50_Using_a_persistent_Pidgin_configuration.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:02:50_Using_a_persistent_Pidgin_configuration.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:02:50_Using_a_persistent_Pidgin_configuration.journal

After features/support/hooks.rb:100 0.000
Tags: @product
9.049
Scenario No unexpected error messages in the journal after booting from DVD
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.004
Steps
Given I have started Tails from DVD without network and logged in 8.943
Then there are no unexpected messages of priority "err" or higher in the journal 0.105
Unexpected error messages in the journal: [
  {
    "MESSAGE": "/usr/lib/udev/rules.d/90-alsa-restore.rules:18 GOTO=\"alsa_restore_std\" has no matching label, ignoring.",
    "SYSLOG_IDENTIFIER": "systemd-udevd",
    "CODE_FILE": "src/udev/udev-rules.c",
    "CODE_FUNC": "rule_resolve_goto",
    "CODE_LINE": "1376",
    "PRIORITY": "3",
    "SYSLOG_FACILITY": "3",
    "TID": "575",
    "_BOOT_ID": "1bd677df9e8442eea101b9705f92f045",
    "_CAP_EFFECTIVE": "1f7fdffffff",
    "_CMDLINE": "/usr/lib/systemd/systemd-udevd",
    "_COMM": "systemd-udevd",
    "_EXE": "/usr/bin/udevadm",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "575",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749638741453970",
    "_SYSTEMD_CGROUP": "/system.slice/systemd-udevd.service/udev",
    "_SYSTEMD_INVOCATION_ID": "325cf62e9a2145aea3ea693ecdb4c416",
    "_SYSTEMD_SLICE": "system.slice",
    "_SYSTEMD_UNIT": "systemd-udevd.service",
    "_TRANSPORT": "journal",
    "_UID": "0",
    "__CURSOR": "s=ae0ede818d034f829a70a0a9496ad8c2;i=37c;b=1bd677df9e8442eea101b9705f92f045;m=d3f77a;t=6374983abc6a6;x=6324ed0b7a72b5fe",
    "__MONOTONIC_TIMESTAMP": "13891450",
    "__REALTIME_TIMESTAMP": "1749638741477030",
    "__SEQNUM": "892",
    "__SEQNUM_ID": "ae0ede818d034f829a70a0a9496ad8c2"
  },
  {
    "MESSAGE": "/usr/lib/udev/rules.d/90-alsa-restore.rules:22 GOTO=\"alsa_restore_std\" has no matching label, ignoring.",
    "SYSLOG_IDENTIFIER": "systemd-udevd",
    "CODE_FILE": "src/udev/udev-rules.c",
    "CODE_FUNC": "rule_resolve_goto",
    "CODE_LINE": "1376",
    "PRIORITY": "3",
    "SYSLOG_FACILITY": "3",
    "TID": "575",
    "_BOOT_ID": "1bd677df9e8442eea101b9705f92f045",
    "_CAP_EFFECTIVE": "1f7fdffffff",
    "_CMDLINE": "/usr/lib/systemd/systemd-udevd",
    "_COMM": "systemd-udevd",
    "_EXE": "/usr/bin/udevadm",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "575",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749638741453994",
    "_SYSTEMD_CGROUP": "/system.slice/systemd-udevd.service/udev",
    "_SYSTEMD_INVOCATION_ID": "325cf62e9a2145aea3ea693ecdb4c416",
    "_SYSTEMD_SLICE": "system.slice",
    "_SYSTEMD_UNIT": "systemd-udevd.service",
    "_TRANSPORT": "journal",
    "_UID": "0",
    "__CURSOR": "s=ae0ede818d034f829a70a0a9496ad8c2;i=37e;b=1bd677df9e8442eea101b9705f92f045;m=d3f921;t=6374983abc84e;x=3789f639fbaee4d6",
    "__MONOTONIC_TIMESTAMP": "13891873",
    "__REALTIME_TIMESTAMP": "1749638741477454",
    "__SEQNUM": "894",
    "__SEQNUM_ID": "ae0ede818d034f829a70a0a9496ad8c2"
  },
  {
    "MESSAGE": "The group `powerdev' does not exist.",
    "SYSLOG_IDENTIFIER": "adduser",
    "PRIORITY": "2",
    "SYSLOG_FACILITY": "1",
    "SYSLOG_PID": "2077",
    "SYSLOG_TIMESTAMP": "Jun 11 10:45:48 ",
    "_BOOT_ID": "1bd677df9e8442eea101b9705f92f045",
    "_CAP_EFFECTIVE": "1ffffffffff",
    "_CMDLINE": "adduser",
    "_COMM": "adduser",
    "_EXE": "/usr/bin/perl",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "2077",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749638748550161",
    "_SYSTEMD_CGROUP": "/system.slice/live-config.service",
    "_SYSTEMD_INVOCATION_ID": "b38fdb33c14c4f92b7d0b50a8b6a86e3",
    "_SYSTEMD_SLICE": "system.slice",
    "_SYSTEMD_UNIT": "live-config.service",
    "_TRANSPORT": "syslog",
    "_UID": "0",
    "__CURSOR": "s=ae0ede818d034f829a70a0a9496ad8c2;i=404;b=1bd677df9e8442eea101b9705f92f045;m=13fe4f3;t=637498417b420;x=703b941ffd22ad74",
    "__MONOTONIC_TIMESTAMP": "20964595",
    "__REALTIME_TIMESTAMP": "1749638748550176",
    "__SEQNUM": "1028",
    "__SEQNUM_ID": "ae0ede818d034f829a70a0a9496ad8c2"
  },
  {
    "MESSAGE": "/usr/lib/udev/rules.d/90-alsa-restore.rules:18 GOTO=\"alsa_restore_std\" has no matching label, ignoring.",
    "SYSLOG_IDENTIFIER": "systemd-udevd",
    "CODE_FILE": "src/udev/udev-rules.c",
    "CODE_FUNC": "rule_resolve_goto",
    "CODE_LINE": "1376",
    "PRIORITY": "3",
    "SYSLOG_FACILITY": "3",
    "TID": "575",
    "_BOOT_ID": "1bd677df9e8442eea101b9705f92f045",
    "_CAP_EFFECTIVE": "1f7fdffffff",
    "_CMDLINE": "/usr/lib/systemd/systemd-udevd",
    "_COMM": "systemd-udevd",
    "_EXE": "/usr/bin/udevadm",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "575",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749638920984493",
    "_SYSTEMD_CGROUP": "/system.slice/systemd-udevd.service/udev",
    "_SYSTEMD_INVOCATION_ID": "325cf62e9a2145aea3ea693ecdb4c416",
    "_SYSTEMD_SLICE": "system.slice",
    "_SYSTEMD_UNIT": "systemd-udevd.service",
    "_TRANSPORT": "journal",
    "_UID": "0",
    "__CURSOR": "s=ae0ede818d034f829a70a0a9496ad8c2;i=8c1;b=1bd677df9e8442eea101b9705f92f045;m=6165a31;t=637498e5ed9e5;x=821474c3d90ec733",
    "__MONOTONIC_TIMESTAMP": "102128177",
    "__REALTIME_TIMESTAMP": "1749638920985061",
    "__SEQNUM": "2241",
    "__SEQNUM_ID": "ae0ede818d034f829a70a0a9496ad8c2"
  },
  {
    "MESSAGE": "/usr/lib/udev/rules.d/90-alsa-restore.rules:22 GOTO=\"alsa_restore_std\" has no matching label, ignoring.",
    "SYSLOG_IDENTIFIER": "systemd-udevd",
    "CODE_FILE": "src/udev/udev-rules.c",
    "CODE_FUNC": "rule_resolve_goto",
    "CODE_LINE": "1376",
    "PRIORITY": "3",
    "SYSLOG_FACILITY": "3",
    "TID": "575",
    "_BOOT_ID": "1bd677df9e8442eea101b9705f92f045",
    "_CAP_EFFECTIVE": "1f7fdffffff",
    "_CMDLINE": "/usr/lib/systemd/systemd-udevd",
    "_COMM": "systemd-udevd",
    "_EXE": "/usr/bin/udevadm",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "575",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749638920984524",
    "_SYSTEMD_CGROUP": "/system.slice/systemd-udevd.service/udev",
    "_SYSTEMD_INVOCATION_ID": "325cf62e9a2145aea3ea693ecdb4c416",
    "_SYSTEMD_SLICE": "system.slice",
    "_SYSTEMD_UNIT": "systemd-udevd.service",
    "_TRANSPORT": "journal",
    "_UID": "0",
    "__CURSOR": "s=ae0ede818d034f829a70a0a9496ad8c2;i=8c3;b=1bd677df9e8442eea101b9705f92f045;m=6165bb2;t=637498e5edb66;x=216f3dc93c4c6386",
    "__MONOTONIC_TIMESTAMP": "102128562",
    "__REALTIME_TIMESTAMP": "1749638920985446",
    "__SEQNUM": "2243",
    "__SEQNUM_ID": "ae0ede818d034f829a70a0a9496ad8c2"
  },
  {
    "MESSAGE": "The user `amnesia' is not a member of group `sudo'.",
    "SYSLOG_IDENTIFIER": "deluser",
    "PRIORITY": "2",
    "SYSLOG_FACILITY": "1",
    "SYSLOG_PID": "8569",
    "SYSLOG_TIMESTAMP": "Jun 11 10:48:46 ",
    "_AUDIT_LOGINUID": "1000",
    "_AUDIT_SESSION": "4",
    "_BOOT_ID": "1bd677df9e8442eea101b9705f92f045",
    "_CAP_EFFECTIVE": "1ffffffffff",
    "_CMDLINE": "/usr/bin/perl -T /usr/sbin/deluser amnesia sudo",
    "_COMM": "deluser",
    "_EXE": "/usr/bin/perl",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "8569",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749638926226258",
    "_SYSTEMD_CGROUP": "/user.slice/user-1000.slice/session-4.scope",
    "_SYSTEMD_INVOCATION_ID": "4ecd08aaccff48d78dad377d1f0d8e4b",
    "_SYSTEMD_OWNER_UID": "1000",
    "_SYSTEMD_SESSION": "4",
    "_SYSTEMD_SLICE": "user-1000.slice",
    "_SYSTEMD_UNIT": "session-4.scope",
    "_SYSTEMD_USER_SLICE": "-.slice",
    "_TRANSPORT": "syslog",
    "_UID": "0",
    "__CURSOR": "s=ae0ede818d034f829a70a0a9496ad8c2;i=1247;b=1bd677df9e8442eea101b9705f92f045;m=66653a8;t=637498eaed35b;x=ee34ba8cad8996db",
    "__MONOTONIC_TIMESTAMP": "107369384",
    "__REALTIME_TIMESTAMP": "1749638926226267",
    "__SEQNUM": "4679",
    "__SEQNUM_ID": "ae0ede818d034f829a70a0a9496ad8c2"
  }
].
<false> is not true. (Test::Unit::AssertionFailedError)
./features/step_definitions/checks.rb:237:in `/^there are no unexpected messages of priority "err" or higher in the journal$/'
features/sane_defaults.feature:21:in `Then there are no unexpected messages of priority "err" or higher in the journal'
After features/support/hooks.rb:330 5.514

SCENARIO FAILED: 'No unexpected error messages in the journal after booting from DVD' (at time 06:08:07)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:08:07_No_unexpected_error_messages_in_the_journal_after_booting_from_DVD.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:08:07_No_unexpected_error_messages_in_the_journal_after_booting_from_DVD.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:08:07_No_unexpected_error_messages_in_the_journal_after_booting_from_DVD.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:08:07_No_unexpected_error_messages_in_the_journal_after_booting_from_DVD.journal

After features/support/hooks.rb:100 0.000
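
Note: this step fails because journal entries of priority "err" (3) or more severe were found that are not on the suite's allowlist: the two udev "GOTO has no matching label" warnings, the adduser "group `powerdev' does not exist" message, and the deluser "not a member of group `sudo'" message. The USB-drive variant of this scenario below reports the same set. As a hedged illustration of that kind of check (not the checks.rb implementation, and with a placeholder allowlist), one could collect such entries from journalctl's JSON output and filter them:

    # Illustrative sketch only -- not the Tails checks.rb implementation.
    require 'json'
    require 'open3'

    # Patterns considered harmless for this illustration (an assumption).
    ALLOWED = [
      /example of a known-harmless message/,   # placeholder pattern
    ].freeze

    def unexpected_journal_errors
      # journalctl --priority=err selects priority "err" (3) and more severe levels.
      out, status = Open3.capture2('journalctl', '--priority=err', '--output=json')
      raise 'journalctl failed' unless status.success?
      out.each_line
         .map { |line| JSON.parse(line) }
         .reject { |entry| ALLOWED.any? { |re| re.match?(entry['MESSAGE'].to_s) } }
    end

    # The scenario would then assert that unexpected_journal_errors is empty.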
Tags: @product
12.906
Scenario No unexpected error messages in the journal after booting from USB drive
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.004
Steps
Given I have started Tails without network from a USB drive with a persistent partition enabled and logged in 12.716
Then there are no unexpected messages of priority "err" or higher in the journal 0.190
Unexpected error messages in the journal: [
  {
    "MESSAGE": "/usr/lib/udev/rules.d/90-alsa-restore.rules:18 GOTO=\"alsa_restore_std\" has no matching label, ignoring.",
    "SYSLOG_IDENTIFIER": "systemd-udevd",
    "CODE_FILE": "src/udev/udev-rules.c",
    "CODE_FUNC": "rule_resolve_goto",
    "CODE_LINE": "1376",
    "PRIORITY": "3",
    "SYSLOG_FACILITY": "3",
    "TID": "706",
    "_BOOT_ID": "fb851adb8914402e911aafcc0b3d979c",
    "_CAP_EFFECTIVE": "1f7fdffffff",
    "_CMDLINE": "/usr/lib/systemd/systemd-udevd",
    "_COMM": "systemd-udevd",
    "_EXE": "/usr/bin/udevadm",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "706",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749640929649483",
    "_SYSTEMD_CGROUP": "/system.slice/systemd-udevd.service/udev",
    "_SYSTEMD_INVOCATION_ID": "78433482b4e245f2b10166c5ccfaeb19",
    "_SYSTEMD_SLICE": "system.slice",
    "_SYSTEMD_UNIT": "systemd-udevd.service",
    "_TRANSPORT": "journal",
    "_UID": "0",
    "__CURSOR": "s=37141ae4efbc4281899f0904196da232;i=368;b=fb851adb8914402e911aafcc0b3d979c;m=d9430a;t=6374a0618e7d3;x=8656d336d8382a5f",
    "__MONOTONIC_TIMESTAMP": "14238474",
    "__REALTIME_TIMESTAMP": "1749640929667027",
    "__SEQNUM": "872",
    "__SEQNUM_ID": "37141ae4efbc4281899f0904196da232"
  },
  {
    "MESSAGE": "/usr/lib/udev/rules.d/90-alsa-restore.rules:22 GOTO=\"alsa_restore_std\" has no matching label, ignoring.",
    "SYSLOG_IDENTIFIER": "systemd-udevd",
    "CODE_FILE": "src/udev/udev-rules.c",
    "CODE_FUNC": "rule_resolve_goto",
    "CODE_LINE": "1376",
    "PRIORITY": "3",
    "SYSLOG_FACILITY": "3",
    "TID": "706",
    "_BOOT_ID": "fb851adb8914402e911aafcc0b3d979c",
    "_CAP_EFFECTIVE": "1f7fdffffff",
    "_CMDLINE": "/usr/lib/systemd/systemd-udevd",
    "_COMM": "systemd-udevd",
    "_EXE": "/usr/bin/udevadm",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "706",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749640929649509",
    "_SYSTEMD_CGROUP": "/system.slice/systemd-udevd.service/udev",
    "_SYSTEMD_INVOCATION_ID": "78433482b4e245f2b10166c5ccfaeb19",
    "_SYSTEMD_SLICE": "system.slice",
    "_SYSTEMD_UNIT": "systemd-udevd.service",
    "_TRANSPORT": "journal",
    "_UID": "0",
    "__CURSOR": "s=37141ae4efbc4281899f0904196da232;i=36a;b=fb851adb8914402e911aafcc0b3d979c;m=d9436a;t=6374a0618e832;x=fd6f55ebdbd23a40",
    "__MONOTONIC_TIMESTAMP": "14238570",
    "__REALTIME_TIMESTAMP": "1749640929667122",
    "__SEQNUM": "874",
    "__SEQNUM_ID": "37141ae4efbc4281899f0904196da232"
  },
  {
    "MESSAGE": "The group `powerdev' does not exist.",
    "SYSLOG_IDENTIFIER": "adduser",
    "PRIORITY": "2",
    "SYSLOG_FACILITY": "1",
    "SYSLOG_PID": "2409",
    "SYSLOG_TIMESTAMP": "Jun 11 11:22:18 ",
    "_BOOT_ID": "fb851adb8914402e911aafcc0b3d979c",
    "_CAP_EFFECTIVE": "1ffffffffff",
    "_CMDLINE": "adduser",
    "_COMM": "adduser",
    "_EXE": "/usr/bin/perl",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "2409",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749640938439971",
    "_SYSTEMD_CGROUP": "/system.slice/live-config.service",
    "_SYSTEMD_INVOCATION_ID": "f010e2c26321481d9858fc620da66f6e",
    "_SYSTEMD_SLICE": "system.slice",
    "_SYSTEMD_UNIT": "live-config.service",
    "_TRANSPORT": "syslog",
    "_UID": "0",
    "__CURSOR": "s=37141ae4efbc4281899f0904196da232;i=40d;b=fb851adb8914402e911aafcc0b3d979c;m=15f209e;t=6374a069ec565;x=27cd8eb91b95d753",
    "__MONOTONIC_TIMESTAMP": "23011486",
    "__REALTIME_TIMESTAMP": "1749640938440037",
    "__SEQNUM": "1037",
    "__SEQNUM_ID": "37141ae4efbc4281899f0904196da232"
  },
  {
    "MESSAGE": "/usr/lib/udev/rules.d/90-alsa-restore.rules:18 GOTO=\"alsa_restore_std\" has no matching label, ignoring.",
    "SYSLOG_IDENTIFIER": "systemd-udevd",
    "CODE_FILE": "src/udev/udev-rules.c",
    "CODE_FUNC": "rule_resolve_goto",
    "CODE_LINE": "1376",
    "PRIORITY": "3",
    "SYSLOG_FACILITY": "3",
    "TID": "706",
    "_BOOT_ID": "fb851adb8914402e911aafcc0b3d979c",
    "_CAP_EFFECTIVE": "1f7fdffffff",
    "_CMDLINE": "/usr/lib/systemd/systemd-udevd",
    "_COMM": "systemd-udevd",
    "_EXE": "/usr/bin/udevadm",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "706",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749641067764280",
    "_SYSTEMD_CGROUP": "/system.slice/systemd-udevd.service/udev",
    "_SYSTEMD_INVOCATION_ID": "78433482b4e245f2b10166c5ccfaeb19",
    "_SYSTEMD_SLICE": "system.slice",
    "_SYSTEMD_UNIT": "systemd-udevd.service",
    "_TRANSPORT": "journal",
    "_UID": "0",
    "__CURSOR": "s=37141ae4efbc4281899f0904196da232;i=e6d;b=fb851adb8914402e911aafcc0b3d979c;m=8e86c68;t=6374a0e541a50;x=117932d296aed445",
    "__MONOTONIC_TIMESTAMP": "149449832",
    "__REALTIME_TIMESTAMP": "1749641067764304",
    "__SEQNUM": "3693",
    "__SEQNUM_ID": "37141ae4efbc4281899f0904196da232"
  },
  {
    "MESSAGE": "/usr/lib/udev/rules.d/90-alsa-restore.rules:22 GOTO=\"alsa_restore_std\" has no matching label, ignoring.",
    "SYSLOG_IDENTIFIER": "systemd-udevd",
    "CODE_FILE": "src/udev/udev-rules.c",
    "CODE_FUNC": "rule_resolve_goto",
    "CODE_LINE": "1376",
    "PRIORITY": "3",
    "SYSLOG_FACILITY": "3",
    "TID": "706",
    "_BOOT_ID": "fb851adb8914402e911aafcc0b3d979c",
    "_CAP_EFFECTIVE": "1f7fdffffff",
    "_CMDLINE": "/usr/lib/systemd/systemd-udevd",
    "_COMM": "systemd-udevd",
    "_EXE": "/usr/bin/udevadm",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "706",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749641067764731",
    "_SYSTEMD_CGROUP": "/system.slice/systemd-udevd.service/udev",
    "_SYSTEMD_INVOCATION_ID": "78433482b4e245f2b10166c5ccfaeb19",
    "_SYSTEMD_SLICE": "system.slice",
    "_SYSTEMD_UNIT": "systemd-udevd.service",
    "_TRANSPORT": "journal",
    "_UID": "0",
    "__CURSOR": "s=37141ae4efbc4281899f0904196da232;i=e6f;b=fb851adb8914402e911aafcc0b3d979c;m=8e86e17;t=6374a0e541c00;x=bd35995cd0dd828b",
    "__MONOTONIC_TIMESTAMP": "149450263",
    "__REALTIME_TIMESTAMP": "1749641067764736",
    "__SEQNUM": "3695",
    "__SEQNUM_ID": "37141ae4efbc4281899f0904196da232"
  },
  {
    "MESSAGE": "The user `amnesia' is not a member of group `sudo'.",
    "SYSLOG_IDENTIFIER": "deluser",
    "PRIORITY": "2",
    "SYSLOG_FACILITY": "1",
    "SYSLOG_PID": "12110",
    "SYSLOG_TIMESTAMP": "Jun 11 11:24:35 ",
    "_AUDIT_LOGINUID": "1000",
    "_AUDIT_SESSION": "4",
    "_BOOT_ID": "fb851adb8914402e911aafcc0b3d979c",
    "_CAP_EFFECTIVE": "1ffffffffff",
    "_CMDLINE": "/usr/bin/perl -T /usr/sbin/deluser amnesia sudo",
    "_COMM": "deluser",
    "_EXE": "/usr/bin/perl",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "12110",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749641075701894",
    "_SYSTEMD_CGROUP": "/user.slice/user-1000.slice/session-4.scope",
    "_SYSTEMD_INVOCATION_ID": "84dbe7480dd340f7a52277aba7b7d83f",
    "_SYSTEMD_OWNER_UID": "1000",
    "_SYSTEMD_SESSION": "4",
    "_SYSTEMD_SLICE": "user-1000.slice",
    "_SYSTEMD_UNIT": "session-4.scope",
    "_SYSTEMD_USER_SLICE": "-.slice",
    "_TRANSPORT": "syslog",
    "_UID": "0",
    "__CURSOR": "s=37141ae4efbc4281899f0904196da232;i=1857;b=fb851adb8914402e911aafcc0b3d979c;m=9618aa6;t=6374a0ecd388f;x=eff89d08df4b2b20",
    "__MONOTONIC_TIMESTAMP": "157387430",
    "__REALTIME_TIMESTAMP": "1749641075701903",
    "__SEQNUM": "6231",
    "__SEQNUM_ID": "37141ae4efbc4281899f0904196da232"
  }
].
<false> is not true. (Test::Unit::AssertionFailedError)
./features/step_definitions/checks.rb:237:in `/^there are no unexpected messages of priority "err" or higher in the journal$/'
features/sane_defaults.feature:25:in `Then there are no unexpected messages of priority "err" or higher in the journal'
After features/support/hooks.rb:330 5.092

SCENARIO FAILED: 'No unexpected error messages in the journal after booting from USB drive' (at time 06:08:26)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:08:26_No_unexpected_error_messages_in_the_journal_after_booting_from_USB_drive.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:08:26_No_unexpected_error_messages_in_the_journal_after_booting_from_USB_drive.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:08:26_No_unexpected_error_messages_in_the_journal_after_booting_from_USB_drive.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:08:26_No_unexpected_error_messages_in_the_journal_after_booting_from_USB_drive.journal

After features/support/hooks.rb:100 0.000
Tags: @product
26.294
Scenario I can shutdown Tails via the system menu
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.005
Steps
Given I have started Tails from DVD and logged in and the network is connected 12.353
When I request a shutdown using the system menu 13.940
try_for() timeout expired (Timeout::Error)
./features/support/helpers/misc_helpers.rb:145:in `rescue in try_for'
./features/support/helpers/misc_helpers.rb:51:in `try_for'
./features/support/helpers/dogtail.rb:371:in `grabFocus'
./features/step_definitions/common_steps.rb:887:in `/^I request a (shutdown|reboot) using the system menu$/'
features/shutdown_applet.feature:6:in `When I request a shutdown using the system menu'
Then Tails eventually shuts down 0.000
After features/support/hooks.rb:330 5.297

SCENARIO FAILED: 'I can shutdown Tails via the system menu' (at time 06:08:57)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:08:57_I_can_shutdown_Tails_via_the_system_menu.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:08:57_I_can_shutdown_Tails_via_the_system_menu.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:08:57_I_can_shutdown_Tails_via_the_system_menu.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:08:57_I_can_shutdown_Tails_via_the_system_menu.journal

After features/support/hooks.rb:100 0.000
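
Note: this scenario and the reboot scenario below fail the same way: dogtail.rb's grabFocus runs inside the suite's try_for() helper, which retries a block until it succeeds or a timeout expires, and here the timeout expired while focusing the system menu. As a rough sketch of that retry idiom (hedged; the real misc_helpers.rb try_for is more featureful), it boils down to:

    # Illustrative sketch of a try_for-style helper -- not the Tails misc_helpers.rb code.
    require 'timeout'

    def try_for(timeout, delay: 1)
      Timeout.timeout(timeout) do
        loop do
          return true if yield
          sleep delay
        end
      end
    rescue Timeout::Error
      raise Timeout::Error, 'try_for() timeout expired'
    end

    # Example: wait up to 30 s for some UI condition to become true.
    # try_for(30) { system_menu_focused? }   # hypothetical predicate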
25.292
Scenario I can reboot Tails via the system menu
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.006
Steps
Given I have started Tails from DVD and logged in and the network is connected 11.509
When I request a reboot using the system menu 13.783
try_for() timeout expired (Timeout::Error)
./features/support/helpers/misc_helpers.rb:145:in `rescue in try_for'
./features/support/helpers/misc_helpers.rb:51:in `try_for'
./features/support/helpers/dogtail.rb:371:in `grabFocus'
./features/step_definitions/common_steps.rb:887:in `/^I request a (shutdown|reboot) using the system menu$/'
features/shutdown_applet.feature:12:in `When I request a reboot using the system menu'
Then Tails eventually restarts 0.000
After features/support/hooks.rb:330 5.392

SCENARIO FAILED: 'I can reboot Tails via the system menu' (at time 06:09:28)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:09:28_I_can_reboot_Tails_via_the_system_menu.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:09:28_I_can_reboot_Tails_via_the_system_menu.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:09:28_I_can_reboot_Tails_via_the_system_menu.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:09:28_I_can_reboot_Tails_via_the_system_menu.journal

After features/support/hooks.rb:100 0.000
2:22.377
Scenario I can send emails, and receive emails over IMAP
Steps
When I enter my email credentials into the autoconfiguration wizard 2:22.377
try_for() timeout expired (Timeout::Error)
./features/support/helpers/misc_helpers.rb:145:in `rescue in try_for'
./features/support/helpers/misc_helpers.rb:51:in `try_for'
./features/step_definitions/thunderbird.rb:106:in `/^I enter my email credentials into the autoconfiguration wizard$/'
features/thunderbird.feature:19:in `When I enter my email credentials into the autoconfiguration wizard'
Then the autoconfiguration wizard's choice for the incoming server is secure IMAP 0.000
And the autoconfiguration wizard's choice for the outgoing server is secure SMTP 0.000
When I accept the autoconfiguration wizard's configuration 0.000
And I send an email to myself 0.000
And I fetch my email 0.000
Then I can find the email I sent to myself in my inbox 0.000
After features/support/hooks.rb:521 0.039
After features/support/hooks.rb:330 10.819

SCENARIO FAILED: 'I can send emails, and receive emails over IMAP' (at time 06:14:22)

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:14:22_I_can_send_emails,_and_receive_emails_over_IMAP.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:14:22_I_can_send_emails,_and_receive_emails_over_IMAP.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:14:22_I_can_send_emails,_and_receive_emails_over_IMAP.journal

After features/support/hooks.rb:100 0.000
Tags: @product
37.382
Scenario The Tor Browser uses TBB's shared libraries
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.007
Steps
Given I have started Tails from DVD and logged in and the network is connected 11.383
When I start the Tor Browser 22.925
Then the Tor Browser uses all expected TBB shared libraries 3.073
The loaded shared libraries for the firefox process are not the way we expect them.
Expected TBB libs that are absent: ["libfreeblpriv3.so", "libgkcodecs.so", "liblgpllibs.so", "libmozgtk.so", "libmozsandbox.so", "libmozsqlite3.so", "libmozwayland.so", "libnspr4.so", "libnss3.so", "libnssckbi.so", "libnssutil3.so", "libplc4.so", "libplds4.so", "libsmime3.so", "libsoftokn3.so", "libssl3.so", "libxul.so"]
Native libs that we don't want: [].
<false> is not true. (Test::Unit::AssertionFailedError)
./features/step_definitions/browser.rb:259:in `xul_app_shared_lib_check'
./features/step_definitions/browser.rb:273:in `/^the (.*) uses all expected TBB shared libraries$/'
features/torified_browsing.feature:76:in `Then the Tor Browser uses all expected TBB shared libraries'
After features/support/hooks.rb:330 5.858

SCENARIO FAILED: 'The Tor Browser uses TBB's shared libraries' (at time 06:44:56)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:44:56_The_Tor_Browser_uses_TBB_s_shared_libraries.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:44:56_The_Tor_Browser_uses_TBB_s_shared_libraries.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:44:56_The_Tor_Browser_uses_TBB_s_shared_libraries.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:44:56_The_Tor_Browser_uses_TBB_s_shared_libraries.journal

After features/support/hooks.rb:100 0.000
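
Note: the assertion fails because none of the expected Tor Browser Bundle libraries (libxul.so, libnss3.so, and so on) appear among the firefox process's loaded shared objects. On Linux the mapped libraries of a process can be read from /proc/<pid>/maps; a hedged sketch of that kind of comparison (not browser.rb's xul_app_shared_lib_check) follows:

    # Illustrative sketch only -- not browser.rb's xul_app_shared_lib_check.
    EXPECTED_TBB_LIBS = %w[libxul.so libnss3.so libnspr4.so libmozsqlite3.so].freeze

    def loaded_libraries(pid)
      # Each mapping line may end with the backing path; keep *.so basenames.
      File.readlines("/proc/#{pid}/maps")
          .map { |line| line.split[5] }
          .compact
          .select { |path| path.end_with?('.so') || path.include?('.so.') }
          .map { |path| File.basename(path) }
          .uniq
    end

    def missing_tbb_libs(pid)
      EXPECTED_TBB_LIBS - loaded_libraries(pid)
    end

    # An empty result from missing_tbb_libs(firefox_pid) would satisfy the check.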
Tags: @product
2:1.622
Scenario The Persistent directory is usable in Tor Browser
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.008
Steps
Given I have started Tails without network from a USB drive with a persistent partition enabled and logged in 10.648
And the network is plugged 0.073
And I successfully configure Tor 20.130
And available upgrades have been checked 1.167
And all notifications have disappeared 7.107
And there is a GNOME bookmark for the Persistent directory 24.297
When I start the Tor Browser 29.779
And I download some file in the Tor Browser to the Persistent directory 28.418
The Dogtail init script raised: SearchError: child of [desktop frame | main]: 'org.gnome.Nautilus' application
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:70:in `initialize'
./features/step_definitions/browser.rb:6:in `new'
./features/step_definitions/browser.rb:6:in `desktop_portal_save_as'
./features/step_definitions/browser.rb:310:in `/^I download some file in the Tor Browser to the (.*) directory$/'
features/torified_browsing.feature:113:in `And I download some file in the Tor Browser to the Persistent directory'
Then the file is saved to the Persistent directory 0.000
When I open the address "https://tails.net/about" in the Tor Browser 0.000
Then "Tails - How Tails works" has loaded in the Tor Browser 0.000
And I can print the current page as "output.pdf" to the Persistent directory 0.000
After features/support/hooks.rb:330 10.711

SCENARIO FAILED: 'The Persistent directory is usable in Tor Browser' (at time 06:50:20)

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:50:20_The_Persistent_directory_is_usable_in_Tor_Browser.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:50:20_The_Persistent_directory_is_usable_in_Tor_Browser.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:50:20_The_Persistent_directory_is_usable_in_Tor_Browser.journal

After features/support/hooks.rb:100 0.000
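
Note: this failure, and the five "default XDG directories" examples below, break at the same place: the save dialog goes through the desktop portal, and the Dogtail helper then searches the accessibility tree for an 'org.gnome.Nautilus' application node that never appears (SearchError). One hedged way to narrow down whether the file-chooser process is even running before blaming the accessibility search is to check for its well-known D-Bus name; the sketch below does that with busctl and is only a diagnostic illustration, not part of the test suite (it would need to run in the desktop user's session).

    # Illustrative diagnostic sketch -- not part of the Tails test suite.
    require 'open3'

    # Returns true if a well-known name is currently owned on the session bus.
    def session_bus_name_owned?(name)
      out, status = Open3.capture2('busctl', '--user', 'list', '--no-legend')
      status.success? && out.lines.any? { |line| line.split.first == name }
    end

    # e.g. session_bus_name_owned?('org.gnome.Nautilus')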
Tags: @product
1:13.960
Scenario Outline The default XDG directories are usable in Tor Browser
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.004
Steps
Given I have started Tails from DVD without network and logged in 10.771
Then the live user's Documents directory exists 0.080
And there is a GNOME bookmark for the Documents directory 17.688
Then I start the Tor Browser in offline mode 23.999
And I can save the current page as "index.html" to the Documents GNOME bookmark 21.420
The Dogtail init script raised: SearchError: child of [desktop frame | main]: 'org.gnome.Nautilus' application
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:70:in `initialize'
./features/step_definitions/browser.rb:6:in `new'
./features/step_definitions/browser.rb:6:in `desktop_portal_save_as'
./features/step_definitions/browser.rb:38:in `browser_save_page_as'
./features/step_definitions/browser.rb:543:in `/^I (can|cannot) save the current page as "([^"]+[.]html)" to the (.*) (directory|GNOME bookmark)$/'
features/torified_browsing.feature:127:in `And I can save the current page as "index.html" to the Documents GNOME bookmark'
features/torified_browsing.feature:124:in `And I can save the current page as "index.html" to the <dir> GNOME bookmark'
After features/support/hooks.rb:330 9.929

SCENARIO FAILED: 'The default XDG directories are usable in Tor Browser, Examples (#1)' (at time 06:51:44)

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:51:44_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___1_.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:51:44_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___1_.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:51:44_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___1_.journal

After features/support/hooks.rb:100 0.000
Tags: @product
1:27.027
Scenario Outline The default XDG directories are usable in Tor Browser
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.009
Steps
Given I have started Tails from DVD without network and logged in 22.144
Then the live user's Downloads directory exists 0.152
And there is a GNOME bookmark for the Downloads directory 17.505
Then I start the Tor Browser in offline mode 22.392
And I can save the current page as "index.html" to the Downloads GNOME bookmark 24.832
The Dogtail init script raised: SearchError: child of [desktop frame | main]: 'org.gnome.Nautilus' application
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:70:in `initialize'
./features/step_definitions/browser.rb:6:in `new'
./features/step_definitions/browser.rb:6:in `desktop_portal_save_as'
./features/step_definitions/browser.rb:38:in `browser_save_page_as'
./features/step_definitions/browser.rb:543:in `/^I (can|cannot) save the current page as "([^"]+[.]html)" to the (.*) (directory|GNOME bookmark)$/'
features/torified_browsing.feature:128:in `And I can save the current page as "index.html" to the Downloads GNOME bookmark'
features/torified_browsing.feature:124:in `And I can save the current page as "index.html" to the <dir> GNOME bookmark'
After features/support/hooks.rb:330 9.115

SCENARIO FAILED: 'The default XDG directories are usable in Tor Browser, Examples (#2)' (at time 06:53:21)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:53:21_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___2_.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:53:21_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___2_.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:53:21_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___2_.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:53:21_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___2_.journal

After features/support/hooks.rb:100 0.000
Tags: @product
1:21.795
Scenario Outline The default XDG directories are usable in Tor Browser
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.007
Steps
Given I have started Tails from DVD without network and logged in 8.964
Then the live user's Music directory exists 0.111
And there is a GNOME bookmark for the Music directory 17.733
Then I start the Tor Browser in offline mode 24.475
And I can save the current page as "index.html" to the Music GNOME bookmark 30.509
The Dogtail init script raised: SearchError: child of [desktop frame | main]: 'org.gnome.Nautilus' application
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:70:in `initialize'
./features/step_definitions/browser.rb:6:in `new'
./features/step_definitions/browser.rb:6:in `desktop_portal_save_as'
./features/step_definitions/browser.rb:38:in `browser_save_page_as'
./features/step_definitions/browser.rb:543:in `/^I (can|cannot) save the current page as "([^"]+[.]html)" to the (.*) (directory|GNOME bookmark)$/'
features/torified_browsing.feature:129:in `And I can save the current page as "index.html" to the Music GNOME bookmark'
features/torified_browsing.feature:124:in `And I can save the current page as "index.html" to the <dir> GNOME bookmark'
After features/support/hooks.rb:330 9.748

SCENARIO FAILED: 'The default XDG directories are usable in Tor Browser, Examples (#3)' (at time 06:54:52)

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:54:52_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___3_.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:54:52_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___3_.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:54:52_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___3_.journal

After features/support/hooks.rb:100 0.000
Tags: @product
1:11.791
Scenario Outline The default XDG directories are usable in Tor Browser
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.005
Steps
Given I have started Tails from DVD without network and logged in 8.521
Then the live user's Pictures directory exists 0.269
And there is a GNOME bookmark for the Pictures directory 18.262
Then I start the Tor Browser in offline mode 23.291
And I can save the current page as "index.html" to the Pictures GNOME bookmark 21.446
The Dogtail init script raised: SearchError: child of [desktop frame | main]: 'org.gnome.Nautilus' application
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:70:in `initialize'
./features/step_definitions/browser.rb:6:in `new'
./features/step_definitions/browser.rb:6:in `desktop_portal_save_as'
./features/step_definitions/browser.rb:38:in `browser_save_page_as'
./features/step_definitions/browser.rb:543:in `/^I (can|cannot) save the current page as "([^"]+[.]html)" to the (.*) (directory|GNOME bookmark)$/'
features/torified_browsing.feature:130:in `And I can save the current page as "index.html" to the Pictures GNOME bookmark'
features/torified_browsing.feature:124:in `And I can save the current page as "index.html" to the <dir> GNOME bookmark'
After features/support/hooks.rb:330 11.024

SCENARIO FAILED: 'The default XDG directories are usable in Tor Browser, Examples (#4)' (at time 06:56:14)

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:56:14_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___4_.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:56:14_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___4_.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:56:14_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___4_.journal

After features/support/hooks.rb:100 0.000
Tags: @product
1:15.664
Scenario Outline The default XDG directories are usable in Tor Browser
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.006
Steps
Given I have started Tails from DVD without network and logged in 8.747
Then the live user's Videos directory exists 0.114
And there is a GNOME bookmark for the Videos directory 19.436
Then I start the Tor Browser in offline mode 33.159
And I can save the current page as "index.html" to the Videos GNOME bookmark 14.207
The Dogtail init script raised: SearchError: child of [desktop frame | main]: 'org.gnome.Nautilus' application
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:70:in `initialize'
./features/step_definitions/browser.rb:6:in `new'
./features/step_definitions/browser.rb:6:in `desktop_portal_save_as'
./features/step_definitions/browser.rb:38:in `browser_save_page_as'
./features/step_definitions/browser.rb:543:in `/^I (can|cannot) save the current page as "([^"]+[.]html)" to the (.*) (directory|GNOME bookmark)$/'
features/torified_browsing.feature:131:in `And I can save the current page as "index.html" to the Videos GNOME bookmark'
features/torified_browsing.feature:124:in `And I can save the current page as "index.html" to the <dir> GNOME bookmark'
After features/support/hooks.rb:330 9.809

SCENARIO FAILED: 'The default XDG directories are usable in Tor Browser, Examples (#5)' (at time 06:57:41)

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:57:41_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___5_.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:57:41_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___5_.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:57:41_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___5_.journal

After features/support/hooks.rb:100 0.000
Tags: @product
55.833
Scenario Persistent browser bookmarks
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.007
Steps
Given I have started Tails without network from a USB drive with a persistent partition enabled and logged in 10.773
And all tps features are active 2.365
And all persistent filesystems have safe access rights 0.239
And all persistence configuration files have safe access rights 0.486
And all persistent directories have safe access rights 2.805
When I start the Tor Browser in offline mode 39.164
cannot find TorBrowserNewTabButton.png on the screen (FindFailed)
./features/support/helpers/screen.rb:181:in `rescue in wait'
./features/support/helpers/screen.rb:165:in `wait'
./features/step_definitions/browser.rb:479:in `/^the Tor Browser (?:has started|starts)$/'
./features/step_definitions/browser.rb:466:in `/^I start the Tor Browser( in offline mode)?$/'
features/torified_browsing.feature:139:in `When I start the Tor Browser in offline mode'
And I add a bookmark to eff.org in the Tor Browser 0.000
And I cold reboot the computer 0.000
And the computer reboots Tails 0.000
And I enable persistence 0.000
And I log in to a new session 0.000
And all notifications have disappeared 0.000
And I start the Tor Browser in offline mode 0.000
Then the Tor Browser has a bookmark to eff.org 0.000
After features/support/hooks.rb:330 9.168

SCENARIO FAILED: 'Persistent browser bookmarks' (at time 06:58:46)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:58:46_Persistent_browser_bookmarks.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:58:46_Persistent_browser_bookmarks.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:58:46_Persistent_browser_bookmarks.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/06:58:46_Persistent_browser_bookmarks.journal

After features/support/hooks.rb:100 0.000
51.261
Scenario Starting a second instance of the Unsafe Browser results in an error message being shown
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.008
Steps
Given I have started Tails from DVD and logged in and the network is connected 11.569
When I successfully start the Unsafe Browser 17.728
# Wait for whatever facility the GNOME Activities Overview uses to
# learn about which applications are running to "settle". Without
# this sleep, it is confused and it's impossible to start a new
# instance (it will just switch to the one we already started).
And I wait 10 seconds 10.005

Slept for 10 seconds

And I try to start the Unsafe Browser 0.328
Then I see a warning about another instance already running 11.630
The Dogtail init script raised: SearchError: child of [desktop frame | main]: 'zenity' application
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:70:in `initialize'
./features/step_definitions/unsafe_browser.rb:76:in `new'
./features/step_definitions/unsafe_browser.rb:76:in `/^I see a warning about another instance already running$/'
features/unsafe_browser.feature:59:in `Then I see a warning about another instance already running'
After features/support/hooks.rb:330 10.418

SCENARIO FAILED: 'Starting a second instance of the Unsafe Browser results in an error message being shown' (at time 07:07:45)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:07:45_Starting_a_second_instance_of_the_Unsafe_Browser_results_in_an_error_message_being_shown.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:07:45_Starting_a_second_instance_of_the_Unsafe_Browser_results_in_an_error_message_being_shown.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:07:45_Starting_a_second_instance_of_the_Unsafe_Browser_results_in_an_error_message_being_shown.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:07:45_Starting_a_second_instance_of_the_Unsafe_Browser_results_in_an_error_message_being_shown.journal

After features/support/hooks.rb:100 0.000
21.064
Scenario Use Unlock VeraCrypt Volumes to unlock a USB drive that has a basic VeraCrypt volume
Steps
When I plug a USB drive containing a basic VeraCrypt volume 8.260
And I cancel the GNOME authentication prompt 2.518
And I unlock and mount this VeraCrypt volume with Unlock VeraCrypt Volumes 6.150
And I open this VeraCrypt volume in GNOME Files 3.555
Then I see the expected contents in this VeraCrypt volume 0.469
When I lock the currently opened VeraCrypt volume 0.110
Command failed: udisksctl unmount --block-device /dev/mapper/tcrypt-*
error code: 1
stdout: 
stderr: Error unmounting /dev/dm-0: GDBus.Error:org.freedesktop.UDisks2.Error.DeviceBusy: Error unmounting /dev/dm-0: target is busy
.
<false> is not true. (ExecutionFailedInVM)
./features/support/helpers/vm_helper.rb:496:in `rescue in execute_successfully'
./features/support/helpers/vm_helper.rb:493:in `execute_successfully'
./features/step_definitions/veracrypt.rb:330:in `/^I lock the currently opened VeraCrypt (volume|file container)$/'
features/veracrypt.feature:18:in `When I lock the currently opened VeraCrypt volume'
Then the VeraCrypt volume has been unmounted and locked 0.000
After features/support/hooks.rb:330 5.542

SCENARIO FAILED: 'Use Unlock VeraCrypt Volumes to unlock a USB drive that has a basic VeraCrypt volume' (at time 07:12:18)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:12:18_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_USB_drive_that_has_a_basic_VeraCrypt_volume.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:12:18_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_USB_drive_that_has_a_basic_VeraCrypt_volume.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:12:18_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_USB_drive_that_has_a_basic_VeraCrypt_volume.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:12:18_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_USB_drive_that_has_a_basic_VeraCrypt_volume.journal

After features/support/hooks.rb:100 0.000
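
Note: this scenario and the VeraCrypt scenarios that follow fail identically: udisksctl reports "target is busy" when unmounting the unlocked tcrypt device, which usually means some process (most likely the GNOME Files window just opened on the volume) still holds the mount point open. A hedged way to see what is holding it, and to retry the unmount, is sketched below; this is an illustration, not the veracrypt.rb step definition.

    # Illustrative sketch only -- not the Tails veracrypt.rb step definition.
    require 'open3'

    def report_mount_holders(mountpoint)
      # fuser -vm lists processes with open files on the mounted filesystem.
      out, err, _status = Open3.capture3('fuser', '-vm', mountpoint)
      warn "Processes holding #{mountpoint}:\n#{out}#{err}"
    end

    def unmount_with_retries(device, attempts: 5, delay: 2)
      attempts.times do
        _out, err, status = Open3.capture3('udisksctl', 'unmount', '--block-device', device)
        return true if status.success?
        warn "unmount failed: #{err.strip}"
        sleep delay
      end
      false
    end

    # report_mount_holders('/media/amnesia/volume')   # hypothetical mountpoint
    # unmount_with_retries('/dev/dm-0')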
25.454
Scenario Use Unlock VeraCrypt Volumes to unlock a USB drive that has a hidden VeraCrypt volume
Steps
When I plug a USB drive containing a hidden VeraCrypt volume 11.469
And I cancel the GNOME authentication prompt 2.483
And I unlock and mount this VeraCrypt volume with Unlock VeraCrypt Volumes 7.622
And I open this VeraCrypt volume in GNOME Files 3.266
Then I see the expected contents in this VeraCrypt volume 0.504
When I lock the currently opened VeraCrypt volume 0.108
Command failed: udisksctl unmount --block-device /dev/mapper/tcrypt-*
error code: 1
stdout: 
stderr: Error unmounting /dev/dm-0: GDBus.Error:org.freedesktop.UDisks2.Error.DeviceBusy: Error unmounting /dev/dm-0: target is busy
.
<false> is not true. (ExecutionFailedInVM)
./features/support/helpers/vm_helper.rb:496:in `rescue in execute_successfully'
./features/support/helpers/vm_helper.rb:493:in `execute_successfully'
./features/step_definitions/veracrypt.rb:330:in `/^I lock the currently opened VeraCrypt (volume|file container)$/'
features/veracrypt.feature:27:in `When I lock the currently opened VeraCrypt volume'
Then the VeraCrypt volume has been unmounted and locked 0.000
After features/support/hooks.rb:330 5.538

SCENARIO FAILED: 'Use Unlock VeraCrypt Volumes to unlock a USB drive that has a hidden VeraCrypt volume' (at time 07:12:57)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:12:57_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_USB_drive_that_has_a_hidden_VeraCrypt_volume.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:12:57_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_USB_drive_that_has_a_hidden_VeraCrypt_volume.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:12:57_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_USB_drive_that_has_a_hidden_VeraCrypt_volume.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:12:57_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_USB_drive_that_has_a_hidden_VeraCrypt_volume.journal

After features/support/hooks.rb:100 0.000
29.311
Scenario Use Unlock VeraCrypt Volumes to unlock a basic VeraCrypt file container with a PIM
Steps
When I plug and mount a USB drive containing a basic VeraCrypt file container with a PIM 16.212
And I unlock and mount this VeraCrypt file container with Unlock VeraCrypt Volumes 9.392
And I open this VeraCrypt volume in GNOME Files 3.158
Then I see the expected contents in this VeraCrypt volume 0.431
When I lock the currently opened VeraCrypt file container 0.115
Command failed: udisksctl unmount --block-device /dev/mapper/tcrypt-*
error code: 1
stdout: 
stderr: Error unmounting /dev/dm-0: GDBus.Error:org.freedesktop.UDisks2.Error.DeviceBusy: Error unmounting /dev/dm-0: target is busy
.
<false> is not true. (ExecutionFailedInVM)
./features/support/helpers/vm_helper.rb:496:in `rescue in execute_successfully'
./features/support/helpers/vm_helper.rb:493:in `execute_successfully'
./features/step_definitions/veracrypt.rb:330:in `/^I lock the currently opened VeraCrypt (volume|file container)$/'
features/veracrypt.feature:35:in `When I lock the currently opened VeraCrypt file container'
Then the VeraCrypt file container has been unmounted and locked 0.000
After features/support/hooks.rb:330 5.576

SCENARIO FAILED: 'Use Unlock VeraCrypt Volumes to unlock a basic VeraCrypt file container with a PIM' (at time 07:13:40)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:13:40_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_basic_VeraCrypt_file_container_with_a_PIM.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:13:40_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_basic_VeraCrypt_file_container_with_a_PIM.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:13:40_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_basic_VeraCrypt_file_container_with_a_PIM.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:13:40_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_basic_VeraCrypt_file_container_with_a_PIM.journal

After features/support/hooks.rb:100 0.000
40.019
Scenario Use Unlock VeraCrypt Volumes to unlock a hidden VeraCrypt file container
Steps
When I plug and mount a USB drive containing a hidden VeraCrypt file container 26.589
And I unlock and mount this VeraCrypt file container with Unlock VeraCrypt Volumes 9.622
And I open this VeraCrypt volume in GNOME Files 3.229
Then I see the expected contents in this VeraCrypt volume 0.490
When I lock the currently opened VeraCrypt file container 0.086
Command failed: udisksctl unmount --block-device /dev/mapper/tcrypt-*
error code: 1
stdout: 
stderr: Error unmounting /dev/dm-0: GDBus.Error:org.freedesktop.UDisks2.Error.DeviceBusy: Error unmounting /dev/dm-0: target is busy
.
<false> is not true. (ExecutionFailedInVM)
./features/support/helpers/vm_helper.rb:496:in `rescue in execute_successfully'
./features/support/helpers/vm_helper.rb:493:in `execute_successfully'
./features/step_definitions/veracrypt.rb:330:in `/^I lock the currently opened VeraCrypt (volume|file container)$/'
features/veracrypt.feature:43:in `When I lock the currently opened VeraCrypt file container'
Then the VeraCrypt file container has been unmounted and locked 0.000
After features/support/hooks.rb:330 5.529

SCENARIO FAILED: 'Use Unlock VeraCrypt Volumes to unlock a hidden VeraCrypt file container' (at time 07:14:35)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:14:35_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_hidden_VeraCrypt_file_container.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:14:35_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_hidden_VeraCrypt_file_container.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:14:35_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_hidden_VeraCrypt_file_container.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:14:35_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_hidden_VeraCrypt_file_container.journal

After features/support/hooks.rb:100 0.037
25.652
Scenario Use GNOME Disks to unlock a USB drive that has a basic VeraCrypt volume with a keyfile
Steps
When I plug a USB drive containing a basic VeraCrypt volume with a keyfile 8.630
And I cancel the GNOME authentication prompt 2.446
And I unlock and mount this VeraCrypt volume with GNOME Disks 10.733
And I open this VeraCrypt volume in GNOME Files 3.202
Then I see the expected contents in this VeraCrypt volume 0.486
When I lock the currently opened VeraCrypt volume 0.153
Command failed: udisksctl unmount --block-device /dev/mapper/tcrypt-*
error code: 1
stdout: 
stderr: Error unmounting /dev/dm-0: GDBus.Error:org.freedesktop.UDisks2.Error.DeviceBusy: Error unmounting /dev/dm-0: target is busy
.
<false> is not true. (ExecutionFailedInVM)
./features/support/helpers/vm_helper.rb:496:in `rescue in execute_successfully'
./features/support/helpers/vm_helper.rb:493:in `execute_successfully'
./features/step_definitions/veracrypt.rb:330:in `/^I lock the currently opened VeraCrypt (volume|file container)$/'
features/veracrypt.feature:54:in `When I lock the currently opened VeraCrypt volume'
Then the VeraCrypt volume has been unmounted and locked 0.000
After features/support/hooks.rb:330 5.780

SCENARIO FAILED: 'Use GNOME Disks to unlock a USB drive that has a basic VeraCrypt volume with a keyfile' (at time 07:15:15)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:15:15_Use_GNOME_Disks_to_unlock_a_USB_drive_that_has_a_basic_VeraCrypt_volume_with_a_keyfile.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:15:15_Use_GNOME_Disks_to_unlock_a_USB_drive_that_has_a_basic_VeraCrypt_volume_with_a_keyfile.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:15:15_Use_GNOME_Disks_to_unlock_a_USB_drive_that_has_a_basic_VeraCrypt_volume_with_a_keyfile.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:15:15_Use_GNOME_Disks_to_unlock_a_USB_drive_that_has_a_basic_VeraCrypt_volume_with_a_keyfile.journal

After features/support/hooks.rb:100 0.000
26.559
Scenario Use GNOME Disks to unlock a USB drive that has a hidden VeraCrypt volume
Steps
When I plug a USB drive containing a hidden VeraCrypt volume 11.404
And I cancel the GNOME authentication prompt 2.236
And I unlock and mount this VeraCrypt volume with GNOME Disks 9.101
And I open this VeraCrypt volume in GNOME Files 3.219
Then I see the expected contents in this VeraCrypt volume 0.482
When I lock the currently opened VeraCrypt volume 0.116
Command failed: udisksctl unmount --block-device /dev/mapper/tcrypt-*
error code: 1
stdout: 
stderr: Error unmounting /dev/dm-0: GDBus.Error:org.freedesktop.UDisks2.Error.DeviceBusy: Error unmounting /dev/dm-0: target is busy
.
<false> is not true. (ExecutionFailedInVM)
./features/support/helpers/vm_helper.rb:496:in `rescue in execute_successfully'
./features/support/helpers/vm_helper.rb:493:in `execute_successfully'
./features/step_definitions/veracrypt.rb:330:in `/^I lock the currently opened VeraCrypt (volume|file container)$/'
features/veracrypt.feature:63:in `When I lock the currently opened VeraCrypt volume'
Then the VeraCrypt volume has been unmounted and locked 0.000
After features/support/hooks.rb:330 5.495

SCENARIO FAILED: 'Use GNOME Disks to unlock a USB drive that has a hidden VeraCrypt volume' (at time 07:15:55)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:15:55_Use_GNOME_Disks_to_unlock_a_USB_drive_that_has_a_hidden_VeraCrypt_volume.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:15:55_Use_GNOME_Disks_to_unlock_a_USB_drive_that_has_a_hidden_VeraCrypt_volume.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:15:55_Use_GNOME_Disks_to_unlock_a_USB_drive_that_has_a_hidden_VeraCrypt_volume.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:15:55_Use_GNOME_Disks_to_unlock_a_USB_drive_that_has_a_hidden_VeraCrypt_volume.journal

After features/support/hooks.rb:100 0.000
49.350
Scenario Use GNOME Disks to unlock a basic VeraCrypt file container with a keyfile
Steps
When I plug and mount a USB drive containing a basic VeraCrypt file container with a keyfile 23.265
And I unlock and mount this VeraCrypt file container with GNOME Disks 22.251
And I open this VeraCrypt volume in GNOME Files 3.265
Then I see the expected contents in this VeraCrypt volume 0.450
When I lock the currently opened VeraCrypt file container 0.117
Command failed: udisksctl unmount --block-device /dev/mapper/tcrypt-*
error code: 1
stdout: 
stderr: Error unmounting /dev/dm-0: GDBus.Error:org.freedesktop.UDisks2.Error.DeviceBusy: Error unmounting /dev/dm-0: target is busy
.
<false> is not true. (ExecutionFailedInVM)
./features/support/helpers/vm_helper.rb:496:in `rescue in execute_successfully'
./features/support/helpers/vm_helper.rb:493:in `execute_successfully'
./features/step_definitions/veracrypt.rb:330:in `/^I lock the currently opened VeraCrypt (volume|file container)$/'
features/veracrypt.feature:71:in `When I lock the currently opened VeraCrypt file container'
Then the VeraCrypt file container has been unmounted and locked 0.000
After features/support/hooks.rb:330 5.644

SCENARIO FAILED: 'Use GNOME Disks to unlock a basic VeraCrypt file container with a keyfile' (at time 07:16:59)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:16:59_Use_GNOME_Disks_to_unlock_a_basic_VeraCrypt_file_container_with_a_keyfile.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:16:59_Use_GNOME_Disks_to_unlock_a_basic_VeraCrypt_file_container_with_a_keyfile.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:16:59_Use_GNOME_Disks_to_unlock_a_basic_VeraCrypt_file_container_with_a_keyfile.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:16:59_Use_GNOME_Disks_to_unlock_a_basic_VeraCrypt_file_container_with_a_keyfile.journal

After features/support/hooks.rb:100 0.014
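
A side note on the failing command: the step passes the shell glob /dev/mapper/tcrypt-*, while UDisks reports the error against the resolved node /dev/dm-0. The following purely illustrative sketch resolves the mapping first, so that log messages and any retry logic name one concrete device; it assumes $vm.execute returns an object with a stdout field, as the helpers in the backtraces suggest.

  # Sketch only: resolve the tcrypt mapping to its concrete device node, so
  # the error output matches the device UDisks complains about ("/dev/dm-0").
  mapping = $vm.execute('readlink -f /dev/mapper/tcrypt-*').stdout.split.first
  raise 'no tcrypt mapping found' if mapping.to_s.empty?
  $vm.execute_successfully("udisksctl unmount --block-device #{mapping}")
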
49.742
Scenario Use GNOME Disks to unlock a hidden VeraCrypt file container
Steps
When I plug and mount a USB drive containing a hidden VeraCrypt file container 26.427
And I unlock and mount this VeraCrypt file container with GNOME Disks 19.472
And I open this VeraCrypt volume in GNOME Files 3.265
Then I see the expected contents in this VeraCrypt volume 0.458
When I lock the currently opened VeraCrypt file container 0.117
Command failed: udisksctl unmount --block-device /dev/mapper/tcrypt-*
error code: 1
stdout: 
stderr: Error unmounting /dev/dm-0: GDBus.Error:org.freedesktop.UDisks2.Error.DeviceBusy: Error unmounting /dev/dm-0: target is busy
.
<false> is not true. (ExecutionFailedInVM)
./features/support/helpers/vm_helper.rb:496:in `rescue in execute_successfully'
./features/support/helpers/vm_helper.rb:493:in `execute_successfully'
./features/step_definitions/veracrypt.rb:330:in `/^I lock the currently opened VeraCrypt (volume|file container)$/'
features/veracrypt.feature:79:in `When I lock the currently opened VeraCrypt file container'
Then the VeraCrypt file container has been unmounted and locked 0.000
After features/support/hooks.rb:330 6.119

SCENARIO FAILED: 'Use GNOME Disks to unlock a hidden VeraCrypt file container' (at time 07:18:03)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:18:03_Use_GNOME_Disks_to_unlock_a_hidden_VeraCrypt_file_container.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:18:03_Use_GNOME_Disks_to_unlock_a_hidden_VeraCrypt_file_container.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:18:03_Use_GNOME_Disks_to_unlock_a_hidden_VeraCrypt_file_container.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:18:03_Use_GNOME_Disks_to_unlock_a_hidden_VeraCrypt_file_container.journal

After features/support/hooks.rb:100 0.015
38.285
Scenario Use GNOME Disks to unlock a basic VeraCrypt file container with a PIM
Steps
When I plug and mount a USB drive containing a basic VeraCrypt file container with a PIM 16.335
And I unlock and mount this VeraCrypt file container with GNOME Disks 18.074
And I open this VeraCrypt volume in GNOME Files 3.298
Then I see the expected contents in this VeraCrypt volume 0.451
When I lock the currently opened VeraCrypt file container 0.125
Command failed: udisksctl unmount --block-device /dev/mapper/tcrypt-*
error code: 1
stdout: 
stderr: Error unmounting /dev/dm-0: GDBus.Error:org.freedesktop.UDisks2.Error.DeviceBusy: Error unmounting /dev/dm-0: target is busy
.
<false> is not true. (ExecutionFailedInVM)
./features/support/helpers/vm_helper.rb:496:in `rescue in execute_successfully'
./features/support/helpers/vm_helper.rb:493:in `execute_successfully'
./features/step_definitions/veracrypt.rb:330:in `/^I lock the currently opened VeraCrypt (volume|file container)$/'
features/veracrypt.feature:87:in `When I lock the currently opened VeraCrypt file container'
Then the VeraCrypt file container has been unmounted and locked 0.000
After features/support/hooks.rb:330 5.479

SCENARIO FAILED: 'Use GNOME Disks to unlock a basic VeraCrypt file container with a PIM' (at time 07:18:56)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:18:56_Use_GNOME_Disks_to_unlock_a_basic_VeraCrypt_file_container_with_a_PIM.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:18:56_Use_GNOME_Disks_to_unlock_a_basic_VeraCrypt_file_container_with_a_PIM.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:18:56_Use_GNOME_Disks_to_unlock_a_basic_VeraCrypt_file_container_with_a_PIM.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/27/artifact/build-artifacts/07:18:56_Use_GNOME_Disks_to_unlock_a_basic_VeraCrypt_file_container_with_a_PIM.journal

After features/support/hooks.rb:100 0.000
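
None of the artifacts above say which process keeps /dev/dm-0 busy, so a debugging hook could capture that before teardown. A hedged sketch, assuming debug_log is the suite's logging helper and that fuser (psmisc) is available in the image:

  # Sketch of a debugging aid: when the unmount fails with DeviceBusy, record
  # which processes still hold the mount point, so the journal and screenshot
  # artifacts can be correlated with a culprit.
  mountpoint = $vm.execute('findmnt -rn -o TARGET /dev/dm-0').stdout.strip
  unless mountpoint.empty?
    holders = $vm.execute("fuser -vm '#{mountpoint}' 2>&1").stdout
    debug_log("Processes keeping #{mountpoint} busy:\n#{holders}")
  end

Running this right before the teardown in the After hook (hooks.rb:330) would attach the holder list to the same artifact set as the journals linked above.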