Project: test_Tails_ISO_feature-trixie
Number: 28
Date: 14 Jun 2025, 07:18

Failures Overview

The following summary lists the scenarios that failed in this run.

Scenario: Erasure of memory freed by killed userspace processes (23.774)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.003
Steps
Given I have started Tails from DVD without network and logged in 10.627
And I prepare Tails for memory erasure tests 2.827
When I start a process allocating 128 MiB of memory with a known pattern 4.146
Then patterns cover at least 128 MiB in the guest's memory 6.173
99.610% of the expected size (128 MiB) has the pattern, but more than 100.000% was expected.
<false> is not true. (Test::Unit::AssertionFailedError)
./features/step_definitions/erase_memory.rb:186:in `/^patterns cover at least (\d+) MiB in the guest's memory$/'
features/erase_memory.feature:17:in `Then patterns cover at least 128 MiB in the guest's memory'
Output 1

Pattern coverage: 99.610% (127 MiB out of 128 MiB reference memory)

When I kill the allocating process 0.000
Then I find very few patterns in the guest's memory after 5 seconds 0.000
After features/support/hooks.rb:330 5.579

SCENARIO FAILED: 'Erasure of memory freed by killed userspace processes' (at time 00:53:00)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/00:53:00_Erasure_of_memory_freed_by_killed_userspace_processes.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/00:53:00_Erasure_of_memory_freed_by_killed_userspace_processes.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/00:53:00_Erasure_of_memory_freed_by_killed_userspace_processes.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/00:53:00_Erasure_of_memory_freed_by_killed_userspace_processes.journal

After features/support/hooks.rb:100 0.000
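Note on the assertion above: the wording of the failure message suggests the step compares the measured pattern coverage against a strict "more than 100%" threshold, so any shortfall at all (here 99.610% of 128 MiB) fails the scenario. A minimal, self-contained illustration of that kind of check follows; it is a sketch of the failure condition, not the code in features/step_definitions/erase_memory.rb:

    # Illustrative sketch only (not erase_memory.rb): a strict "more than 100%"
    # coverage check fails as soon as any part of the reference memory lacks
    # the pattern.
    reference_mib = 128
    coverage      = 0.99610            # fraction measured in the run above
    min_coverage  = 1.0                # "more than 100.000% was expected"

    unless coverage > min_coverage
      raise "#{format('%.3f', coverage * 100)}% of the expected size " \
            "(#{reference_mib} MiB) has the pattern, but more than " \
            "#{format('%.3f', min_coverage * 100)}% was expected."
    end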
Scenario: I set up Additional Software when installing a package without persistent partition and the package is installed next time I start Tails (12:25.852)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.006
Before features/support/hooks.rb:507 0.005
Steps
Given I start Tails from a freshly installed USB drive with an administration password and the network is plugged and I login 1:35.525
And I update APT using apt 11.832
And I install "popularity-contest" using apt 28.545
Then I am proposed to add the "popularity-contest" package to my Additional Software 10.803
When I create a persistent storage and activate the Additional Software feature 29.065
Then Additional Software is correctly configured for package "popularity-contest" 2.838
When I shutdown Tails and wait for the computer to power off 16.469
And I start Tails from USB drive "__internal" with network unplugged 2:1.592
And I enable persistence 33.749
And I log in to a new session 29.703
And the Additional Software installation service has started 1:1.922
Then I am notified that the installation succeeded 5:3.805
try_for() timeout expired (Timeout::Error)
./features/support/helpers/misc_helpers.rb:145:in `rescue in try_for'
./features/support/helpers/misc_helpers.rb:51:in `try_for'
./features/step_definitions/common_steps.rb:595:in `/^I see the "(.+)" notification(?: after at most (\d+) seconds)?$/'
./features/step_definitions/additional_software_packages.rb:28:in `/^I am notified that the installation succeeded$/'
features/additional_software_packages.feature:39:in `Then I am notified that the installation succeeded'
And Additional Software is correctly configured for package "popularity-contest" 0.000
And the package "popularity-contest" is installed after Additional Software has been started 0.000
After features/support/hooks.rb:515 0.033
After features/support/hooks.rb:330 10.901

SCENARIO FAILED: 'I set up Additional Software when installing a package without persistent partition and the package is installed next time I start Tails' (at time 01:55:25)

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/01:55:25_I_set_up_Additional_Software_when_installing_a_package_without_persistent_partition_and_the_package_is_installed_next_time_I_start_Tails.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/01:55:25_I_set_up_Additional_Software_when_installing_a_package_without_persistent_partition_and_the_package_is_installed_next_time_I_start_Tails.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/01:55:25_I_set_up_Additional_Software_when_installing_a_package_without_persistent_partition_and_the_package_is_installed_next_time_I_start_Tails.journal

After features/support/hooks.rb:100 0.012
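The "try_for() timeout expired" error above (and in several later scenarios) comes from the suite's poll-until-timeout helper in features/support/helpers/misc_helpers.rb. A minimal sketch of that pattern, assuming the helper simply re-evaluates a block until it returns true or the deadline passes (not the actual implementation):

    require 'timeout'

    # Hedged sketch of the poll-until-timeout pattern behind
    # "try_for() timeout expired"; not the real misc_helpers.rb code.
    def try_for(timeout, delay: 1, msg: 'try_for() timeout expired')
      Timeout.timeout(timeout) do
        loop do
          return true if yield
          sleep delay
        end
      end
    rescue Timeout::Error
      raise Timeout::Error, msg
    end

    # Usage, as in the failing step: wait up to 300 seconds for a notification.
    # (notification_visible? is a hypothetical predicate for illustration.)
    # try_for(300) { notification_visible?('The installation of your additional software failed') }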
Scenario: The Additional Software dpkg hook notices when persistence is locked down while installing a package (25:41.467)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.005
Before features/support/hooks.rb:507 0.005
Steps
Given a computer 0.175
And I start Tails from USB drive "__internal" and I login with an administration password 3:17.518
And I update APT using apt 17.557
When I install "makepp" using apt 22:6.216
Tor operation failed (despite retrying 10 times) with
Timeout::Error: try_for() timeout expired
Last ignored exception was: ExecutionFailedInVM: Command failed: dpkg -s 'makepp' 2>/dev/null | grep -qs '^Status:.*installed$'
error code: 1
stdout: 
stderr: .
<false> is not true.
./features/support/helpers/vm_helper.rb:496:in `rescue in execute_successfully'
./features/support/helpers/vm_helper.rb:493:in `execute_successfully'
./features/step_definitions/apt.rb:69:in `block in wait_for_package_installation'
./features/support/helpers/misc_helpers.rb:78:in `block (2 levels) in try_for'
./features/support/helpers/misc_helpers.rb:66:in `loop'
./features/support/helpers/misc_helpers.rb:66:in `block in try_for'
./features/support/helpers/misc_helpers.rb:65:in `try_for'
./features/step_definitions/apt.rb:68:in `wait_for_package_installation'
./features/step_definitions/apt.rb:86:in `block (3 levels) in <top (required)>'
./features/step_definitions/apt.rb:82:in `block (2 levels) in <top (required)>'
./features/support/helpers/misc_helpers.rb:211:in `block in retry_action'
./features/support/helpers/misc_helpers.rb:208:in `loop'
./features/support/helpers/misc_helpers.rb:208:in `retry_action'
./features/support/helpers/misc_helpers.rb:196:in `retry_tor'
./features/step_definitions/apt.rb:81:in `block in <top (required)>' (MaxRetriesFailure)
./features/support/helpers/misc_helpers.rb:226:in `rescue in block in retry_action'
./features/support/helpers/misc_helpers.rb:209:in `block in retry_action'
./features/support/helpers/misc_helpers.rb:208:in `loop'
./features/support/helpers/misc_helpers.rb:208:in `retry_action'
./features/support/helpers/misc_helpers.rb:196:in `retry_tor'
./features/step_definitions/apt.rb:81:in `/^I install "(.+)" using apt$/'
features/additional_software_packages.feature:48:in `When I install "makepp" using apt'
Then the Additional Software dpkg hook has been run for package "makepp" and notices the persistence is locked 0.000
And the package "makepp" is installed 0.000
After features/support/hooks.rb:515 0.036
After features/support/hooks.rb:330 5.396

SCENARIO FAILED: 'The Additional Software dpkg hook notices when persistence is locked down while installing a package' (at time 02:21:17)

Exception thrown while trying to save /tmp/TailsToaster/artifact.file_content_run_live-additional-software_log: RemoteShell::ServerFailure: FileNotFoundError: [Errno 2] No such file or directory: '/run/live-additional-software/log'

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:21:17_The_Additional_Software_dpkg_hook_notices_when_persistence_is_locked_down_while_installing_a_package.file_content_var_log_boot.log

Output of ls -la --full-time /live/persistence/tailsdata_unlocked: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:21:17_The_Additional_Software_dpkg_hook_notices_when_persistence_is_locked_down_while_installing_a_package.cmd_output_persistent_volume

Output of ls -lar --full-time /var/cache/apt: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:21:17_The_Additional_Software_dpkg_hook_notices_when_persistence_is_locked_down_while_installing_a_package.cmd_output_var_cache_apt

Output of ls -lar --full-time /var/lib/apt: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:21:17_The_Additional_Software_dpkg_hook_notices_when_persistence_is_locked_down_while_installing_a_package.cmd_output_var_lib_apt

Output of mount: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:21:17_The_Additional_Software_dpkg_hook_notices_when_persistence_is_locked_down_while_installing_a_package.cmd_output_mount

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:21:17_The_Additional_Software_dpkg_hook_notices_when_persistence_is_locked_down_while_installing_a_package.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:21:17_The_Additional_Software_dpkg_hook_notices_when_persistence_is_locked_down_while_installing_a_package.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:21:17_The_Additional_Software_dpkg_hook_notices_when_persistence_is_locked_down_while_installing_a_package.journal

After features/support/hooks.rb:100 0.040
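The MaxRetriesFailure above ("Tor operation failed (despite retrying 10 times)") is raised by the suite's bounded-retry wrapper in misc_helpers.rb once the retry budget is exhausted; the last ignored exception is folded into the message. A minimal sketch of that retry pattern, assuming the wrapper simply re-runs the block and re-raises after the allowed retries (the real retry_tor likely also recovers the Tor session between attempts, which is omitted here):

    class MaxRetriesFailure < StandardError; end

    # Hedged sketch of a bounded-retry wrapper; not the actual
    # retry_action/retry_tor from misc_helpers.rb.
    def retry_action(max_retries, operation_name: 'Operation')
      attempts = 0
      begin
        attempts += 1
        yield
      rescue StandardError => e
        retry if attempts <= max_retries
        raise MaxRetriesFailure,
              "#{operation_name} failed (despite retrying #{max_retries} times) " \
              "with\n#{e.class}: #{e.message}"
      end
    end

    # Usage matching the failing step (install_package is a hypothetical stand-in):
    # retry_action(10, operation_name: 'Tor operation') { install_package('makepp') }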
Scenario: My Additional Software list is configurable through a GUI or through notifications when I install or remove packages with APT or Synaptic (10:25.398)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.005
Before features/support/hooks.rb:507 0.004
Steps
Given a computer 0.189
And I start Tails from USB drive "__internal" and I login with persistence enabled and an administration password 4:39.984
When I uninstall "popularity-contest" using apt 25.232
And I accept removing "popularity-contest" from Additional Software 3.121
Then "popularity-contest" is not in the list of Additional Software 0.958
When I start Synaptic 15.366
And I update APT using Synaptic 22.403
And I install "cowsay" using Synaptic 4:38.141
Tor operation failed (despite retrying 10 times) with
Dogtail::Failure: The Dogtail init script raised: SearchError: child of [application | synaptic]: 'Find' dialog
STDOUT:

STDERR:

 (MaxRetriesFailure)
./features/support/helpers/misc_helpers.rb:226:in `rescue in block in retry_action'
./features/support/helpers/misc_helpers.rb:209:in `block in retry_action'
./features/support/helpers/misc_helpers.rb:208:in `loop'
./features/support/helpers/misc_helpers.rb:208:in `retry_action'
./features/support/helpers/misc_helpers.rb:196:in `retry_tor'
./features/step_definitions/apt.rb:180:in `/^I install "(.+)" using Synaptic$/'
features/additional_software_packages.feature:61:in `And I install "cowsay" using Synaptic'
And I accept adding "cowsay" to Additional Software 0.000
Then Additional Software is correctly configured for package "cowsay" 0.000
When I uninstall "cowsay" using apt 0.000
And I refuse removing "cowsay" from Additional Software 0.000
Then Additional Software is correctly configured for package "cowsay" 0.000
When I start "Additional Software" via GNOME Activities Overview 0.000
And I remove "cowsay" from the list of Additional Software using Additional Software GUI 0.000
Then "cowsay" is not in the list of Additional Software 0.000
When I install "cowsay" using apt 0.000
And I refuse adding "cowsay" to Additional Software 0.000
Then "cowsay" is not in the list of Additional Software 0.000
After features/support/hooks.rb:515 0.047
After features/support/hooks.rb:330 6.165

SCENARIO FAILED: 'My Additional Software list is configurable through a GUI or through notifications when I install or remove packages with APT or Synaptic' (at time 02:31:48)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:31:48_My_Additional_Software_list_is_configurable_through_a_GUI_or_through_notifications_when_I_install_or_remove_packages_with_APT_or_Synaptic.file_content_var_log_boot.log

Content of /run/live-additional-software/log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:31:48_My_Additional_Software_list_is_configurable_through_a_GUI_or_through_notifications_when_I_install_or_remove_packages_with_APT_or_Synaptic.file_content_run_live-additional-software_log

Output of ls -la --full-time /live/persistence/tailsdata_unlocked: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:31:48_My_Additional_Software_list_is_configurable_through_a_GUI_or_through_notifications_when_I_install_or_remove_packages_with_APT_or_Synaptic.cmd_output_persistent_volume

Output of ls -lar --full-time /var/cache/apt: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:31:48_My_Additional_Software_list_is_configurable_through_a_GUI_or_through_notifications_when_I_install_or_remove_packages_with_APT_or_Synaptic.cmd_output_var_cache_apt

Output of ls -lar --full-time /var/lib/apt: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:31:48_My_Additional_Software_list_is_configurable_through_a_GUI_or_through_notifications_when_I_install_or_remove_packages_with_APT_or_Synaptic.cmd_output_var_lib_apt

Output of mount: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:31:48_My_Additional_Software_list_is_configurable_through_a_GUI_or_through_notifications_when_I_install_or_remove_packages_with_APT_or_Synaptic.cmd_output_mount

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:31:48_My_Additional_Software_list_is_configurable_through_a_GUI_or_through_notifications_when_I_install_or_remove_packages_with_APT_or_Synaptic.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:31:48_My_Additional_Software_list_is_configurable_through_a_GUI_or_through_notifications_when_I_install_or_remove_packages_with_APT_or_Synaptic.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:31:48_My_Additional_Software_list_is_configurable_through_a_GUI_or_through_notifications_when_I_install_or_remove_packages_with_APT_or_Synaptic.journal

After features/support/hooks.rb:100 0.012
Scenario: I am notified when Additional Software fails to install a package (3:17.265)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.004
Before features/support/hooks.rb:507 0.005
Steps
Given a computer 0.281
And I start Tails from USB drive "__internal" with network unplugged 1:52.050
And I enable persistence 18.477
And I remove the "cowsay" deb files from the APT cache 0.063
# Prevent the "Warning: virtual machine detected!" notification from racing
# with the one we'll be interacting with below.
And I disable the tails-virt-notify-user.service user unit 0.101
And I log in to a new session 39.348
Then I see the "The installation of your additional software failed" notification after at most 300 seconds 15.871
And I can open the Additional Software log file from the notification 11.071
The Dogtail init script raised: SearchError: descendant of [notification | ]: child with name='Show Log' roleName='button'
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:339:in `initialize'
./features/support/helpers/dogtail.rb:268:in `new'
./features/support/helpers/dogtail.rb:268:in `call_tree_api_method'
./features/support/helpers/dogtail.rb:276:in `child'
./features/step_definitions/additional_software_packages.rb:39:in `click_gnome_shell_notification_button'
./features/step_definitions/additional_software_packages.rb:166:in `/^I can open the Additional Software log file from the notification$/'
features/additional_software_packages.feature:130:in `And I can open the Additional Software log file from the notification'
And the package "cowsay" is not installed 0.000
After features/support/hooks.rb:515 0.056
After features/support/hooks.rb:330 6.841

SCENARIO FAILED: 'I am notified when Additional Software fails to install a package' (at time 02:50:40)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:50:40_I_am_notified_when_Additional_Software_fails_to_install_a_package.file_content_var_log_boot.log

Content of /run/live-additional-software/log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:50:40_I_am_notified_when_Additional_Software_fails_to_install_a_package.file_content_run_live-additional-software_log

Output of ls -la --full-time /live/persistence/tailsdata_unlocked: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:50:40_I_am_notified_when_Additional_Software_fails_to_install_a_package.cmd_output_persistent_volume

Output of ls -lar --full-time /var/cache/apt: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:50:40_I_am_notified_when_Additional_Software_fails_to_install_a_package.cmd_output_var_cache_apt

Output of ls -lar --full-time /var/lib/apt: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:50:40_I_am_notified_when_Additional_Software_fails_to_install_a_package.cmd_output_var_lib_apt

Output of mount: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:50:40_I_am_notified_when_Additional_Software_fails_to_install_a_package.cmd_output_mount

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:50:40_I_am_notified_when_Additional_Software_fails_to_install_a_package.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:50:40_I_am_notified_when_Additional_Software_fails_to_install_a_package.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:50:40_I_am_notified_when_Additional_Software_fails_to_install_a_package.journal

After features/support/hooks.rb:100 0.007
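The recurring Dogtail SearchError failures in this report ("child with name='...' roleName='...'", also seen in the usb_upgrade scenarios below) mean that the accessibility tree under the given node was searched and no descendant matched both the requested name and role. The toy sketch below only illustrates that lookup semantics; it is not the suite's Dogtail wrapper:

    # Toy illustration of the lookup behind "SearchError: ... child with
    # name=... roleName=..."; Node is a stand-in, not the Dogtail wrapper.
    class SearchError < StandardError; end

    Node = Struct.new(:role_name, :name, :children) do
      def child(name, role_name)
        queue = children.dup
        until queue.empty?
          node = queue.shift
          return node if node.name == name && node.role_name == role_name
          queue.concat(node.children)
        end
        raise SearchError,
              "descendant of [#{self.role_name} | #{self.name}]: " \
              "child with name='#{name}' roleName='#{role_name}'"
      end
    end

    notification = Node.new('notification', '', [Node.new('button', 'Details', [])])
    begin
      notification.child('Show Log', 'button')
    rescue SearchError => e
      puts e.message  # like the failure above: no matching button under the notification
    end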
Tags: @product
Scenario: Installing an old version of Tails to a pristine USB drive (2:48.655)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.009
Steps
Given a computer 0.212
And I create a 7200 MiB disk named "old" 0.088
And I plug USB drive "old" 1.069
And I write an old version of the Tails USB image to disk "old" 35.124
When I start Tails from USB drive "old" with network unplugged 2:12.159
The Dogtail init script raised: SearchError: descendent of [application | gnome-shell]: child with roleName='notification'
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:339:in `initialize'
./features/support/helpers/dogtail.rb:268:in `new'
./features/support/helpers/dogtail.rb:268:in `call_tree_api_method'
./features/support/helpers/dogtail.rb:276:in `child'
./features/step_definitions/common_steps.rb:453:in `/^the computer (?:re)?boots Tails$/'
./features/step_definitions/common_steps.rb:276:in `/^I start Tails from (.+?) drive "(.+?)"( with network unplugged)?( and I login( with persistence enabled)?( with the changed persistence passphrase)?( (?:and|with) an administration password)?)?$/'
features/usb_upgrade.feature:25:in `When I start Tails from USB drive "old" with network unplugged'
Then the boot device has safe access rights 0.000
And Tails is running from USB drive "old" 0.000
And there is no persistence partition on USB drive "old" 0.000
And process "udev-watchdog" is running 0.000
And udev-watchdog is monitoring the correct device 0.000
And I unplug USB drive "old" 0.000
After features/support/hooks.rb:330 7.669

SCENARIO FAILED: 'Installing an old version of Tails to a pristine USB drive' (at time 02:53:36)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:53:36_Installing_an_old_version_of_Tails_to_a_pristine_USB_drive.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:53:36_Installing_an_old_version_of_Tails_to_a_pristine_USB_drive.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:53:36_Installing_an_old_version_of_Tails_to_a_pristine_USB_drive.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:53:36_Installing_an_old_version_of_Tails_to_a_pristine_USB_drive.journal

After features/support/hooks.rb:100 0.007
Tags: @product
Scenario: Creating a persistent partition with the old Tails USB installation (1:25.792)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.003
Steps
Given a computer 0.182
And I start Tails from USB drive "old" with network unplugged and I login 1:25.610
The Dogtail init script raised: SearchError: descendent of [application | gnome-shell]: child with roleName='notification'
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:339:in `initialize'
./features/support/helpers/dogtail.rb:268:in `new'
./features/support/helpers/dogtail.rb:268:in `call_tree_api_method'
./features/support/helpers/dogtail.rb:276:in `child'
./features/step_definitions/common_steps.rb:453:in `/^the computer (?:re)?boots Tails$/'
./features/step_definitions/common_steps.rb:276:in `/^I start Tails from (.+?) drive "(.+?)"( with network unplugged)?( and I login( with persistence enabled)?( with the changed persistence passphrase)?( (?:and|with) an administration password)?)?$/'
features/usb_upgrade.feature:36:in `And I start Tails from USB drive "old" with network unplugged and I login'
Then Tails is running from USB drive "old" 0.000
And I create a persistent partition 0.000
And I take note of which tps features are available 0.000
Then a Tails persistence partition exists on USB drive "old" 0.000
And I shutdown Tails and wait for the computer to power off 0.000
After features/support/hooks.rb:330 5.980

SCENARIO FAILED: 'Creating a persistent partition with the old Tails USB installation' (at time 02:55:09)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:55:09_Creating_a_persistent_partition_with_the_old_Tails_USB_installation.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:55:09_Creating_a_persistent_partition_with_the_old_Tails_USB_installation.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:55:09_Creating_a_persistent_partition_with_the_old_Tails_USB_installation.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:55:09_Creating_a_persistent_partition_with_the_old_Tails_USB_installation.journal

After features/support/hooks.rb:100 0.009
Tags: @product
Scenario: Writing files to a read/write-enabled persistent partition with the old Tails USB installation (1:31.942)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.008
Steps
Given a computer 0.202
And I start Tails from USB drive "old" with network unplugged and I login with persistence enabled 1:31.740
The Dogtail init script raised: SearchError: descendent of [application | gnome-shell]: child with roleName='notification'
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:339:in `initialize'
./features/support/helpers/dogtail.rb:268:in `new'
./features/support/helpers/dogtail.rb:268:in `call_tree_api_method'
./features/support/helpers/dogtail.rb:276:in `child'
./features/step_definitions/common_steps.rb:453:in `/^the computer (?:re)?boots Tails$/'
./features/step_definitions/common_steps.rb:276:in `/^I start Tails from (.+?) drive "(.+?)"( with network unplugged)?( and I login( with persistence enabled)?( with the changed persistence passphrase)?( (?:and|with) an administration password)?)?$/'
features/usb_upgrade.feature:46:in `And I start Tails from USB drive "old" with network unplugged and I login with persistence enabled'
Then Tails is running from USB drive "old" 0.000
And all tps features are active 0.000
When I write some files expected to persist 0.000
# Verify that our baseline for the next scenarios is sane
And all persistent filesystems have safe access rights 0.000
And all persistence configuration files have safe access rights 0.000
And all persistent directories from the old Tails version have safe access rights 0.000
And I take note of which tps features are available 0.000
And I shutdown Tails and wait for the computer to power off 0.000
# XXX: how does guestfs work vs snapshots?
Then only the expected files are present on the persistence partition on USB drive "old" 0.000
After features/support/hooks.rb:330 6.007

SCENARIO FAILED: 'Writing files to a read/write-enabled persistent partition with the old Tails USB installation' (at time 02:56:47)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:56:47_Writing_files_to_a_read_write-enabled_persistent_partition_with_the_old_Tails_USB_installation.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:56:47_Writing_files_to_a_read_write-enabled_persistent_partition_with_the_old_Tails_USB_installation.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:56:47_Writing_files_to_a_read_write-enabled_persistent_partition_with_the_old_Tails_USB_installation.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/02:56:47_Writing_files_to_a_read_write-enabled_persistent_partition_with_the_old_Tails_USB_installation.journal

After features/support/hooks.rb:100 0.009
Tags: @product
Scenario: Booting Tails from a USB drive upgraded from USB with persistence enabled (2:9.058)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.014
Steps
Given a computer 0.258
And I start Tails from USB drive "to_upgrade" with network unplugged and I login with persistence enabled 2:8.800
The Dogtail init script raised: SearchError: descendant of [application | Welcome to Tails!]: child with roleName='password text'
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:339:in `initialize'
./features/support/helpers/dogtail.rb:268:in `new'
./features/support/helpers/dogtail.rb:268:in `call_tree_api_method'
./features/support/helpers/dogtail.rb:276:in `child'
./features/step_definitions/usb.rb:671:in `/^I try to enable persistence( with the changed passphrase)?$/'
./features/step_definitions/usb.rb:703:in `/^I enable persistence( with the changed passphrase)?$/'
./features/step_definitions/common_steps.rb:283:in `/^I start Tails from (.+?) drive "(.+?)"( with network unplugged)?( and I login( with persistence enabled)?( with the changed persistence passphrase)?( (?:and|with) an administration password)?)?$/'
features/usb_upgrade.feature:73:in `And I start Tails from USB drive "to_upgrade" with network unplugged and I login with persistence enabled'
Then all tps features from the old Tails version are active 0.000
And Tails is running from USB drive "to_upgrade" 0.000
And the boot device has safe access rights 0.000
And the expected persistent files created with the old Tails version are present in the filesystem 0.000
And all persistent directories from the old Tails version have safe access rights 0.000
After features/support/hooks.rb:330 5.659

SCENARIO FAILED: 'Booting Tails from a USB drive upgraded from USB with persistence enabled' (at time 03:02:01)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/03:02:01_Booting_Tails_from_a_USB_drive_upgraded_from_USB_with_persistence_enabled.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/03:02:01_Booting_Tails_from_a_USB_drive_upgraded_from_USB_with_persistence_enabled.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/03:02:01_Booting_Tails_from_a_USB_drive_upgraded_from_USB_with_persistence_enabled.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/03:02:01_Booting_Tails_from_a_USB_drive_upgraded_from_USB_with_persistence_enabled.journal

After features/support/hooks.rb:100 0.023
Scenario: Upgrading an initial Tails installation with an incremental upgrade (22:53.795)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.013
Steps
Given I have started Tails without network from a USB drive with a persistent partition enabled and logged in 13.635
And no SquashFS delta is installed 0.555
And Tails is fooled to think that version 6.0~testoverlayfs was initially installed 0.420
And Tails is fooled to think it is running version 6.0~testoverlayfs 0.069
And the file system changes introduced in version 6.2~testoverlayfs are not present 0.328
And the file system changes introduced in version 6.3~testoverlayfs are not present 0.597
When the network is plugged 0.081
And Tor is ready 25.033
Then I am proposed to install an incremental upgrade to version 6.2~testoverlayfs 22:13.072
Tor operation failed (despite retrying 10 times) with
FindFailed: can not find any of the patterns ["TailsUpgraderUpgradeTo6.2~testoverlayfs.png", "TailsUpgraderFailure.png"] on the screen (MaxRetriesFailure)
./features/support/helpers/misc_helpers.rb:226:in `rescue in block in retry_action'
./features/support/helpers/misc_helpers.rb:209:in `block in retry_action'
./features/support/helpers/misc_helpers.rb:208:in `loop'
./features/support/helpers/misc_helpers.rb:208:in `retry_action'
./features/support/helpers/misc_helpers.rb:196:in `retry_tor'
./features/step_definitions/usb.rb:1301:in `/^I am proposed to install an incremental upgrade to version (.+)$/'
features/usb_upgrade.feature:90:in `Then I am proposed to install an incremental upgrade to version 6.2~testoverlayfs'
And I can successfully install the incremental upgrade to version 6.2~testoverlayfs 0.000
Given I shutdown Tails and wait for the computer to power off 0.000
When I start Tails from USB drive "__internal" with network unplugged and I login with persistence enabled 0.000
Then Tails is running version 6.2~testoverlayfs 0.000
And all tps features are active 0.000
And the file system changes introduced in version 6.2~testoverlayfs are present 0.000
And only the 6.2~testoverlayfs SquashFS delta is installed 0.000
# Our IUK sets a release date that can make Tor bootstrapping impossible
Given Tails system time is magically synchronized 0.000
# We'll really install Tails_amd64_6.0~testoverlayfs_to_6.3~testoverlayfs.iuk
# but we need some way to force upgrading a second time in a row
# even if only the initially installed version is considered
And Tails is fooled to think that version 6.1~testoverlayfs was initially installed 0.000
When the network is plugged 0.000
And Tor is ready 0.000
Then I am proposed to install an incremental upgrade to version 6.3~testoverlayfs 0.000
And I can successfully install the incremental upgrade to version 6.3~testoverlayfs 0.000
Given I shutdown Tails and wait for the computer to power off 0.000
When I start Tails from USB drive "__internal" with network unplugged and I login with persistence enabled 0.000
Then Tails is running version 6.3~testoverlayfs 0.000
And all tps features are active 0.000
And the file system changes introduced in version 6.3~testoverlayfs are present 0.000
And only the 6.3~testoverlayfs SquashFS delta is installed 0.000
# Regression test for #17425 (i.e. the Upgrader would propose
# upgrading to the version that's already running)
Given Tails system time is magically synchronized 0.000
And Tails is fooled to think that version 6.1~testoverlayfs was initially installed 0.000
When the network is plugged 0.000
And Tor is ready 0.000
Then the Upgrader considers the system as up-to-date 0.000
# Regression test on #8158 (i.e. the IUK's filesystem is not part of the Unsafe Browser's chroot)
And I successfully start the Unsafe Browser 0.000
And the file system changes introduced in version 6.3~testoverlayfs are present in the Unsafe Browser's chroot 0.000
After features/support/hooks.rb:330 5.790

SCENARIO FAILED: 'Upgrading an initial Tails installation with an incremental upgrade' (at time 03:25:01)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/03:25:01_Upgrading_an_initial_Tails_installation_with_an_incremental_upgrade.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/03:25:01_Upgrading_an_initial_Tails_installation_with_an_incremental_upgrade.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/03:25:01_Upgrading_an_initial_Tails_installation_with_an_incremental_upgrade.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/03:25:01_Upgrading_an_initial_Tails_installation_with_an_incremental_upgrade.journal

After features/support/hooks.rb:100 0.000
Scenario: Upgrading a Tails whose signing key is outdated (22:44.982)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.007
Steps
Given I have started Tails without network from a USB drive with a persistent partition enabled and logged in 11.786
And Tails is fooled to think that version 6.0~testoverlayfs was initially installed 0.770
And Tails is fooled to think it is running version 6.0~testoverlayfs 0.118
And the signing key used by the Upgrader is outdated 0.098
But a current signing key is available on our website 0.000
When the network is plugged 0.077
And Tor is ready 20.310
Then I am proposed to install an incremental upgrade to version 6.2~testoverlayfs 22:11.820
Tor operation failed (despite retrying 10 times) with
FindFailed: can not find any of the patterns ["TailsUpgraderUpgradeTo6.2~testoverlayfs.png", "TailsUpgraderFailure.png"] on the screen (MaxRetriesFailure)
./features/support/helpers/misc_helpers.rb:226:in `rescue in block in retry_action'
./features/support/helpers/misc_helpers.rb:209:in `block in retry_action'
./features/support/helpers/misc_helpers.rb:208:in `loop'
./features/support/helpers/misc_helpers.rb:208:in `retry_action'
./features/support/helpers/misc_helpers.rb:196:in `retry_tor'
./features/step_definitions/usb.rb:1301:in `/^I am proposed to install an incremental upgrade to version (.+)$/'
features/usb_upgrade.feature:134:in `Then I am proposed to install an incremental upgrade to version 6.2~testoverlayfs'
After features/support/hooks.rb:330 5.595

SCENARIO FAILED: 'Upgrading a Tails whose signing key is outdated' (at time 03:47:52)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/03:47:52_Upgrading_a_Tails_whose_signing_key_is_outdated.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/03:47:52_Upgrading_a_Tails_whose_signing_key_is_outdated.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/03:47:52_Upgrading_a_Tails_whose_signing_key_is_outdated.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/03:47:52_Upgrading_a_Tails_whose_signing_key_is_outdated.journal

After features/support/hooks.rb:100 0.000
Tags: @product
Scenario Outline: GPT backup corruption with a persistent partition (2:51.531)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.005
Steps
Given I have started Tails without network from a USB drive with a persistent partition and stopped at Tails Greeter's login screen 11.809
And I corrupt the boot device's GPT backup header 0.418
And I power off the computer 0.534
When I start the computer 1.232
Then the computer boots Tails 1:51.529
When I log in to a new session 33.634
And all notifications have disappeared 12.113
Then I am recommended to migrate to a new USB stick due to partitioning errors 0.258
<nil> was expected to not be nil. (Test::Unit::AssertionFailedError)
./features/step_definitions/hardware.rb:84:in `/^I am recommended to migrate to a new USB stick due to partitioning errors$/'
features/hardware_failure.feature:45:in `Then I am recommended to migrate to a new USB stick due to partitioning errors'
features/hardware_failure.feature:41:in `Then I am recommended to migrate to a new USB stick due to partitioning errors'
And Tails detected partitioning error partitioning-corruption 0.000
After features/support/hooks.rb:330 6.387

SCENARIO FAILED: 'GPT backup corruption with a persistent partition, Examples (#1)' (at time 04:16:23)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/04:16:23_GPT_backup_corruption_with_a_persistent_partition,_Examples___1_.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/04:16:23_GPT_backup_corruption_with_a_persistent_partition,_Examples___1_.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/04:16:23_GPT_backup_corruption_with_a_persistent_partition,_Examples___1_.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/04:16:23_GPT_backup_corruption_with_a_persistent_partition,_Examples___1_.journal

After features/support/hooks.rb:100 0.008
Tags: @product
Scenario Outline: GPT backup corruption with a persistent partition (2:47.960)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.004
Steps
Given I have started Tails without network from a USB drive with a persistent partition and stopped at Tails Greeter's login screen 9.968
And I corrupt the boot device's GPT backup partition table 0.439
And I power off the computer 0.362
When I start the computer 1.204
Then the computer boots Tails 1:50.705
When I log in to a new session 31.139
And all notifications have disappeared 13.861
Then I am recommended to migrate to a new USB stick due to partitioning errors 0.278
<nil> was expected to not be nil. (Test::Unit::AssertionFailedError)
./features/step_definitions/hardware.rb:84:in `/^I am recommended to migrate to a new USB stick due to partitioning errors$/'
features/hardware_failure.feature:46:in `Then I am recommended to migrate to a new USB stick due to partitioning errors'
features/hardware_failure.feature:41:in `Then I am recommended to migrate to a new USB stick due to partitioning errors'
And Tails detected partitioning error partitioning-corruption 0.000
After features/support/hooks.rb:330 6.404

SCENARIO FAILED: 'GPT backup corruption with a persistent partition, Examples (#2)' (at time 04:19:18)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/04:19:18_GPT_backup_corruption_with_a_persistent_partition,_Examples___2_.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/04:19:18_GPT_backup_corruption_with_a_persistent_partition,_Examples___2_.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/04:19:18_GPT_backup_corruption_with_a_persistent_partition,_Examples___2_.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/04:19:18_GPT_backup_corruption_with_a_persistent_partition,_Examples___2_.journal

After features/support/hooks.rb:100 0.008
Tags: @product
Scenario: GPT backup corruption without a persistent partition (3:55.114)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.005
Steps
Given a computer 0.177
And I set Tails to boot with options "test_gpt_corruption=gpt_backup,gpt_backup_table" 0.000
And I temporarily create a 7200 MiB disk named "temp" 0.042
And I plug USB drive "temp" 1.047
And I write the Tails USB image to disk "temp" 33.260
When I start Tails from USB drive "temp" with network unplugged 2:36.007
Then Tails is running from USB drive "temp" 0.401
And the Greeter forbids creating a persistent partition 0.081
When I log in to a new session 26.027
And all notifications have disappeared 17.851
Then I am recommended to reinstall Tails due to partitioning errors 0.215
<"OK"> was expected to include
<"Creation of Persistent Storage has been disabled">. (Test::Unit::AssertionFailedError)
./features/step_definitions/hardware.rb:94:in `/^I am recommended to reinstall Tails due to partitioning errors$/'
features/hardware_failure.feature:59:in `Then I am recommended to reinstall Tails due to partitioning errors'
# We are gonna verify the dialog again so we need to clean up the
# first instance.
And I close the "zenity" window 0.000
And I am told that Persistent Storage cannot be created 0.000
And Tails detected partitioning error partitioning-corruption 0.000
After features/support/hooks.rb:330 6.605

SCENARIO FAILED: 'GPT backup corruption without a persistent partition' (at time 04:23:19)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/04:23:19_GPT_backup_corruption_without_a_persistent_partition.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/04:23:19_GPT_backup_corruption_without_a_persistent_partition.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/04:23:19_GPT_backup_corruption_without_a_persistent_partition.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/04:23:19_GPT_backup_corruption_without_a_persistent_partition.journal

After features/support/hooks.rb:100 0.131
Tags: @product
Scenario: The included APT repository keys are up-to-date (9.258)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.005
Steps
Given I have started Tails from DVD without network and logged in 8.907
Then the keys trusted by APT are valid for the next 3 months 0.350
Command failed: apt-key adv --batch --with-colons --fingerprint --list-key
error code: 127
stdout: 
stderr: /bin/sh: 1: apt-key: not found
.
<false> is not true. (ExecutionFailedInVM)
./features/support/helpers/vm_helper.rb:496:in `rescue in execute_successfully'
./features/support/helpers/vm_helper.rb:493:in `execute_successfully'
./features/step_definitions/checks.rb:19:in `assert_all_keys_are_valid_for_n_months'
./features/step_definitions/checks.rb:8:in `/^the keys trusted by APT are valid for the next (\d+) months$/'
features/keys.feature:11:in `Then the keys trusted by APT are valid for the next 3 months'
After features/support/hooks.rb:330 5.412

SCENARIO FAILED: 'The included APT repository keys are up-to-date' (at time 04:33:48)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/04:33:48_The_included_APT_repository_keys_are_up-to-date.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/04:33:48_The_included_APT_repository_keys_are_up-to-date.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/04:33:48_The_included_APT_repository_keys_are_up-to-date.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/04:33:48_The_included_APT_repository_keys_are_up-to-date.journal

After features/support/hooks.rb:100 0.000
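The failure above is a porting issue rather than an expired key: apt-key exits with code 127 ("apt-key: not found"), i.e. the command is not present in the trixie-based image, so the step can no longer enumerate APT's trusted keys that way. Below is a hedged sketch of one possible replacement that reads the keyrings under /etc/apt/trusted.gpg.d directly with gpg; it is an assumption about how the check could be ported, not the actual fix:

    # Hedged sketch: check that no APT signing key expires within roughly the
    # next 3 months, using gpg instead of apt-key. Field 7 of "pub" records in
    # --with-colons output is the expiration date (seconds since the epoch).
    horizon = Time.now + (3 * 30 * 24 * 60 * 60)

    Dir.glob('/etc/apt/trusted.gpg.d/*.{gpg,asc}').each do |keyring|
      `gpg --show-keys --with-colons #{keyring}`.each_line do |line|
        fields = line.chomp.split(':')
        next unless fields[0] == 'pub'
        expiry = fields[6]              # empty means the key never expires
        next if expiry.nil? || expiry.empty?
        if Time.at(expiry.to_i) < horizon
          warn "#{keyring}: key expires before #{horizon}"
        end
      end
    end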
Feature: Localization
Tags: @product
Scenario Outline: Do not localize the XDG User Dirs to be able to use those dirs in Tor Browser (#19255) (2:7.483)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.005
Steps
Given I have started Tails from DVD without network and stopped at Tails Greeter's login screen 8.538
And I log in to a new session in German (de) 46.957
Then the live user's Music directory exists 0.503
And there is a GNOME bookmark for the Music directory 1:11.484
try_for() timeout expired
Last ignored exception was: Test::Unit::AssertionFailedError: <Dogtail::Failure> exception was expected but none was thrown.
./features/step_definitions/common_steps.rb:1198:in `block (2 levels) in <top (required)>'
./features/support/helpers/misc_helpers.rb:78:in `block (2 levels) in try_for'
./features/support/helpers/misc_helpers.rb:66:in `loop'
./features/support/helpers/misc_helpers.rb:66:in `block in try_for'
./features/support/helpers/misc_helpers.rb:65:in `try_for'
./features/step_definitions/common_steps.rb:1196:in `block in <top (required)>'
./features/step_definitions/common_steps.rb:1221:in `block in <top (required)>' (Timeout::Error)
./features/support/helpers/misc_helpers.rb:145:in `rescue in try_for'
./features/support/helpers/misc_helpers.rb:51:in `try_for'
./features/step_definitions/common_steps.rb:1196:in `/^I close the "([^"]+)" window via AltF4$/'
./features/step_definitions/common_steps.rb:1221:in `/^there is a GNOME bookmark for the (.*) directory$/'
features/localization.feature:16:in `And there is a GNOME bookmark for the Music directory'
features/localization.feature:11:in `And there is a GNOME bookmark for the <dir> directory'
After features/support/hooks.rb:330 6.458

SCENARIO FAILED: 'Do not localize the XDG User Dirs to be able to use those dirs in Tor Browser (#19255), Examples (#3)' (at time 04:38:46)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/04:38:46_Do_not_localize_the_XDG_User_Dirs_to_be_able_to_use_those_dirs_in_Tor_Browser___19255_,_Examples___3_.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/04:38:46_Do_not_localize_the_XDG_User_Dirs_to_be_able_to_use_those_dirs_in_Tor_Browser___19255_,_Examples___3_.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/04:38:46_Do_not_localize_the_XDG_User_Dirs_to_be_able_to_use_those_dirs_in_Tor_Browser___19255_,_Examples___3_.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/04:38:46_Do_not_localize_the_XDG_User_Dirs_to_be_able_to_use_those_dirs_in_Tor_Browser___19255_,_Examples___3_.journal

After features/support/hooks.rb:100 0.000
Feature: Localization
Scenario Outline: Tails is localized for every tier-1 language (2:15.317)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.005
Steps
Given I have started Tails from DVD without network and stopped at Tails Greeter's login screen 8.289
When I log in to a new session in Russian (ru) 52.485
Then the keyboard layout is set to "ru" 0.130
When the network is plugged 0.084
And Tor is ready 19.944
Then I successfully start the Unsafe Browser 16.442
And I kill the Unsafe Browser 7.276
When I enable the screen keyboard 0.222
Then the screen keyboard works in Tor Browser 30.441
try_for() timeout expired (Timeout::Error)
./features/support/helpers/misc_helpers.rb:145:in `rescue in try_for'
./features/support/helpers/misc_helpers.rb:51:in `try_for'
./features/step_definitions/common_steps.rb:1040:in `launch_app'
./features/step_definitions/common_steps.rb:1095:in `launch_tor_browser'
./features/step_definitions/browser.rb:456:in `/^I start the Tor Browser( in offline mode)?$/'
./features/step_definitions/browser.rb:422:in `/^the screen keyboard works in Tor Browser$/'
features/localization.feature:53:in `Then the screen keyboard works in Tor Browser'
features/localization.feature:30:in `Then the screen keyboard works in Tor Browser'
And DuckDuckGo is the default search engine 0.000
And I kill the Tor Browser 0.000
And the screen keyboard works in Thunderbird 0.000
And the layout of the screen keyboard is set to "ru" 0.000
After features/support/hooks.rb:330 10.989

SCENARIO FAILED: 'Tails is localized for every tier-1 language, Examples (#11)' (at time 05:16:50)

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/05:16:50_Tails_is_localized_for_every_tier-1_language,_Examples___11_.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/05:16:50_Tails_is_localized_for_every_tier-1_language,_Examples___11_.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/05:16:50_Tails_is_localized_for_every_tier-1_language,_Examples___11_.journal

After features/support/hooks.rb:100 0.000
Background (52.195)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.005
Steps
Given I have started Tails without network from a USB drive without a persistent partition and logged in 11.243
And I create a persistent partition with the default settings 25.766
Then the "TorConfiguration" tps feature is not enabled 2.673
When the network is plugged 0.078
And the Tor Connection Assistant autostarts 7.063
And I configure some persistent obfs4 bridges in the Tor Connection Assistant 5.369
try_for() timeout expired (Timeout::Error)
./features/support/helpers/misc_helpers.rb:145:in `rescue in try_for'
./features/support/helpers/misc_helpers.rb:51:in `try_for'
./features/step_definitions/tor.rb:828:in `click_connect_to_tor'
./features/step_definitions/tor.rb:637:in `block (2 levels) in <top (required)>'
./features/step_definitions/tor.rb:499:in `tca_configure'
./features/step_definitions/tor.rb:628:in `/^I configure (?:some|the) (persistent )?(\w+) bridges (from a QR code )?in the Tor Connection Assistant(?: in (easy|hide) mode)?( without connecting|)$/'
features/persistent_tor_bridges.feature:13:in `And I configure some persistent obfs4 bridges in the Tor Connection Assistant'
And I wait until Tor is ready 0.000
Then tca.conf includes the configured bridges 0.000
And the "TorConfiguration" tps feature is enabled and active 0.000
When I cold reboot the computer 0.000
And the computer reboots Tails 0.000
And I enable persistence 0.000
And I capture all network traffic 0.000
And I log in to a new session 0.000
Then tca.conf includes the configured bridges 0.000
When the network is plugged 0.000
And the Tor Connection Assistant autostarts 0.000
Tags: @product
Scenario: Disabling persistence of Tor bridges (0.000)
Steps
When I choose to connect to Tor automatically 0.000
And I accept Tor Connection's offer to use my persistent bridges 0.000
And I disable saving bridges to Persistent Storage 0.000
And I configure a direct connection in the Tor Connection Assistant 0.000
Then I wait until Tor is ready 0.000
And tca.conf includes no bridge 0.000
And the "TorConfiguration" tps feature is not enabled and not active 0.000
And all Internet traffic has only flowed through Tor or connectivity check service 0.000
After features/support/hooks.rb:330 5.429

SCENARIO FAILED: 'Disabling persistence of Tor bridges' (at time 06:38:33)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:38:33_Disabling_persistence_of_Tor_bridges.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:38:33_Disabling_persistence_of_Tor_bridges.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:38:33_Disabling_persistence_of_Tor_bridges.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:38:33_Disabling_persistence_of_Tor_bridges.journal

After features/support/hooks.rb:100 0.000
Scenario: Chatting with some friend over XMPP (52.413)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.004
Before features/support/hooks.rb:507 0.004
Steps
Given I have started Tails from DVD and logged in and the network is connected 16.350
When I start "Pidgin Internet Messenger" via GNOME Activities Overview 11.350
Then I see Pidgin's account manager window 3.812
When I create my XMPP account 20.899
cannot find PidginAddAccountWindow.png on the screen (FindFailed)
./features/support/helpers/screen.rb:181:in `rescue in wait'
./features/support/helpers/screen.rb:165:in `wait'
./features/step_definitions/pidgin.rb:123:in `/^I create my XMPP account$/'
features/pidgin.feature:21:in `When I create my XMPP account'
And I close Pidgin's account manager window 0.000
Then Pidgin automatically enables my XMPP account 0.000
Given my XMPP friend goes online 0.000
When I start a conversation with my friend 0.000
And I say something to my friend 0.000
Then I receive a response from my friend 0.000
After features/support/hooks.rb:515 0.037
After features/support/hooks.rb:330 5.937

SCENARIO FAILED: 'Chatting with some friend over XMPP' (at time 06:40:01)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:40:01_Chatting_with_some_friend_over_XMPP.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:40:01_Chatting_with_some_friend_over_XMPP.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:40:01_Chatting_with_some_friend_over_XMPP.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:40:01_Chatting_with_some_friend_over_XMPP.journal

After features/support/hooks.rb:100 0.000
Scenario: Chatting with some friend over XMPP in a multi-user chat (48.123)
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.009
Before features/support/hooks.rb:507 0.011
Steps
Given I have started Tails from DVD and logged in and the network is connected 13.029
When I start "Pidgin Internet Messenger" via GNOME Activities Overview 10.782
Then I see Pidgin's account manager window 3.515
When I create my XMPP account 20.796
cannot find PidginAddAccountWindow.png on the screen (FindFailed)
./features/support/helpers/screen.rb:181:in `rescue in wait'
./features/support/helpers/screen.rb:165:in `wait'
./features/step_definitions/pidgin.rb:123:in `/^I create my XMPP account$/'
features/pidgin.feature:34:in `When I create my XMPP account'
And I close Pidgin's account manager window 0.000
Then Pidgin automatically enables my XMPP account 0.000
When I join some empty multi-user chat 0.000
And I clear the multi-user chat's scrollback 0.000
And my XMPP friend goes online and joins the multi-user chat 0.000
Then I can see that my friend joined the multi-user chat 0.000
And I say something to my friend in the multi-user chat 0.000
Then I receive a response from my friend in the multi-user chat 0.000
When I say https://gitlab.tails.boum.org/tails to my friend in the multi-user chat 0.000
Then I see the Tails GitLab URL 0.000
When I wait 10 seconds 0.000
And I click on the Tails GitLab URL 0.000
Then the Tor Browser loads the Tails GitLab 0.000
After features/support/hooks.rb:515 0.031
After features/support/hooks.rb:330 7.896

SCENARIO FAILED: 'Chatting with some friend over XMPP in a multi-user chat' (at time 06:40:55)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:40:55_Chatting_with_some_friend_over_XMPP_in_a_multi-user_chat.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:40:55_Chatting_with_some_friend_over_XMPP_in_a_multi-user_chat.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:40:55_Chatting_with_some_friend_over_XMPP_in_a_multi-user_chat.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:40:55_Chatting_with_some_friend_over_XMPP_in_a_multi-user_chat.journal

After features/support/hooks.rb:100 0.000
48.396
Scenario Connecting to the tails multi-user chat with my XMPP account
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.011
Before features/support/hooks.rb:507 0.009
Steps
Given I have started Tails from DVD and logged in and the network is connected 13.226
When I start "Pidgin Internet Messenger" via GNOME Activities Overview 10.890
Then I see Pidgin's account manager window 3.332
And I create my XMPP account 20.945
cannot find PidginAddAccountWindow.png on the screen (FindFailed)
./features/support/helpers/screen.rb:181:in `rescue in wait'
./features/support/helpers/screen.rb:165:in `wait'
./features/step_definitions/pidgin.rb:123:in `/^I create my XMPP account$/'
features/pidgin.feature:54:in `And I create my XMPP account'
And I close Pidgin's account manager window 0.000
Then Pidgin automatically enables my XMPP account 0.000
And I can join the "tails" channel on "chat.disroot.org" 0.000
After features/support/hooks.rb:515 0.036
After features/support/hooks.rb:330 6.457

SCENARIO FAILED: 'Connecting to the tails multi-user chat with my XMPP account' (at time 06:41:52)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:41:52_Connecting_to_the_tails_multi-user_chat_with_my_XMPP_account.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:41:52_Connecting_to_the_tails_multi-user_chat_with_my_XMPP_account.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:41:52_Connecting_to_the_tails_multi-user_chat_with_my_XMPP_account.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:41:52_Connecting_to_the_tails_multi-user_chat_with_my_XMPP_account.journal

After features/support/hooks.rb:100 0.000
1:22.114
Scenario Using a persistent Pidgin configuration
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.004
Before features/support/hooks.rb:507 0.005
Steps
Given I have started Tails without network from a USB drive with a persistent partition enabled and logged in 12.385
And the network is plugged 0.165
And Tor is ready 21.721
And available upgrades have been checked 3.300
And all notifications have disappeared 7.254
When I start "Pidgin Internet Messenger" via GNOME Activities Overview 11.645
Then I see Pidgin's account manager window 4.864
When I create my XMPP account 20.777
cannot find PidginAddAccountWindow.png on the screen (FindFailed)
./features/support/helpers/screen.rb:181:in `rescue in wait'
./features/support/helpers/screen.rb:165:in `wait'
./features/step_definitions/pidgin.rb:123:in `/^I create my XMPP account$/'
features/pidgin.feature:68:in `When I create my XMPP account'
And I close Pidgin's account manager window 0.000
Then Pidgin automatically enables my XMPP account 0.000
When I close Pidgin 0.000
And I take note of the configured Pidgin accounts 0.000
And I shutdown Tails and wait for the computer to power off 0.000
Given a computer 0.000
And I start Tails from USB drive "__internal" and I login with persistence enabled 0.000
And Pidgin has the expected persistent accounts configured 0.000
When I start "Pidgin Internet Messenger" via GNOME Activities Overview 0.000
Then Pidgin automatically enables my XMPP account 0.000
And I join some empty multi-user chat 0.000
# Exercise Pidgin AppArmor profile with persistence enabled.
# This should really be in dedicated scenarios, but it would be
# too costly to set up the virtual USB drive with persistence more
# than once in this feature.
Given I start monitoring the AppArmor log of "/usr/bin/pidgin" 0.000
Then I cannot add a certificate from the "/home/amnesia/.gnupg" directory to Pidgin 0.000
And AppArmor has denied "/usr/bin/pidgin" from opening "/home/amnesia/.gnupg/test.crt" 0.000
When I close Pidgin's certificate import failure dialog 0.000
And I close Pidgin's certificate manager 0.000
Given I restart monitoring the AppArmor log of "/usr/bin/pidgin" 0.000
Then I cannot add a certificate from the "/live/persistence/TailsData_unlocked/gnupg" directory to Pidgin 0.000
And AppArmor has denied "/usr/bin/pidgin" from opening "/live/persistence/TailsData_unlocked/gnupg/test.crt" 0.000
When I close Pidgin's certificate import failure dialog 0.000
And I close Pidgin's certificate manager 0.000
Then I can add a certificate from the "/home/amnesia" directory to Pidgin 0.000
After features/support/hooks.rb:515 0.039
After features/support/hooks.rb:330 6.070

SCENARIO FAILED: 'Using a persistent Pidgin configuration' (at time 06:43:20)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:43:20_Using_a_persistent_Pidgin_configuration.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:43:20_Using_a_persistent_Pidgin_configuration.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:43:20_Using_a_persistent_Pidgin_configuration.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:43:20_Using_a_persistent_Pidgin_configuration.journal

After features/support/hooks.rb:100 0.000
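
The AppArmor steps at the end of this scenario never ran, but for reference, the denial they expect shows up in the kernel audit log with apparmor="DENIED", the confined profile, and the path that was opened. A minimal sketch of such a check, assuming the standard journalctl kernel-log output; the profile and path below are the ones named in the steps above, the grep itself is illustrative:

    # Look for an AppArmor denial of pidgin opening the GnuPG test certificate.
    profile = '/usr/bin/pidgin'
    path    = '/home/amnesia/.gnupg/test.crt'

    denials = `journalctl --no-pager -k`.lines.select do |line|
      line.include?('apparmor="DENIED"') &&
        line.include?(%(profile="#{profile}")) &&
        line.include?(%(name="#{path}"))
    end

    puts(denials.empty? ? 'no matching AppArmor denial found' : denials)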
Tags: @product
9.157
Scenario No unexpected error messages in the journal after booting from DVD
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.014
Steps
Given I have started Tails from DVD without network and logged in 9.051
Then there are no unexpected messages of priority "err" or higher in the journal 0.105
Unexpected error messages in the journal: [
  {
    "MESSAGE": "/usr/lib/udev/rules.d/90-alsa-restore.rules:18 GOTO=\"alsa_restore_std\" has no matching label, ignoring.",
    "SYSLOG_IDENTIFIER": "systemd-udevd",
    "CODE_FILE": "src/udev/udev-rules.c",
    "CODE_FUNC": "rule_resolve_goto",
    "CODE_LINE": "1376",
    "PRIORITY": "3",
    "SYSLOG_FACILITY": "3",
    "TID": "578",
    "_BOOT_ID": "d36a10e696ef455e942dc1fb6fd3097e",
    "_CAP_EFFECTIVE": "1f7fdffffff",
    "_CMDLINE": "/usr/lib/systemd/systemd-udevd",
    "_COMM": "systemd-udevd",
    "_EXE": "/usr/bin/udevadm",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "578",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749857297763811",
    "_SYSTEMD_CGROUP": "/system.slice/systemd-udevd.service/udev",
    "_SYSTEMD_INVOCATION_ID": "45c564c2c31e4ae79a3535a8789e2475",
    "_SYSTEMD_SLICE": "system.slice",
    "_SYSTEMD_UNIT": "systemd-udevd.service",
    "_TRANSPORT": "journal",
    "_UID": "0",
    "__CURSOR": "s=935cf24e9cbc4b60bce12d8eb10fa0fd;i=368;b=d36a10e696ef455e942dc1fb6fd3097e;m=120a797;t=6377c66a40e04;x=122d75ea40bf0321",
    "__MONOTONIC_TIMESTAMP": "18917271",
    "__REALTIME_TIMESTAMP": "1749857297763844",
    "__SEQNUM": "872",
    "__SEQNUM_ID": "935cf24e9cbc4b60bce12d8eb10fa0fd"
  },
  {
    "MESSAGE": "/usr/lib/udev/rules.d/90-alsa-restore.rules:22 GOTO=\"alsa_restore_std\" has no matching label, ignoring.",
    "SYSLOG_IDENTIFIER": "systemd-udevd",
    "CODE_FILE": "src/udev/udev-rules.c",
    "CODE_FUNC": "rule_resolve_goto",
    "CODE_LINE": "1376",
    "PRIORITY": "3",
    "SYSLOG_FACILITY": "3",
    "TID": "578",
    "_BOOT_ID": "d36a10e696ef455e942dc1fb6fd3097e",
    "_CAP_EFFECTIVE": "1f7fdffffff",
    "_CMDLINE": "/usr/lib/systemd/systemd-udevd",
    "_COMM": "systemd-udevd",
    "_EXE": "/usr/bin/udevadm",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "578",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749857297764469",
    "_SYSTEMD_CGROUP": "/system.slice/systemd-udevd.service/udev",
    "_SYSTEMD_INVOCATION_ID": "45c564c2c31e4ae79a3535a8789e2475",
    "_SYSTEMD_SLICE": "system.slice",
    "_SYSTEMD_UNIT": "systemd-udevd.service",
    "_TRANSPORT": "journal",
    "_UID": "0",
    "__CURSOR": "s=935cf24e9cbc4b60bce12d8eb10fa0fd;i=36a;b=d36a10e696ef455e942dc1fb6fd3097e;m=120aa0a;t=6377c66a41079;x=2924e3d2db73ace9",
    "__MONOTONIC_TIMESTAMP": "18917898",
    "__REALTIME_TIMESTAMP": "1749857297764473",
    "__SEQNUM": "874",
    "__SEQNUM_ID": "935cf24e9cbc4b60bce12d8eb10fa0fd"
  },
  {
    "MESSAGE": "The group `powerdev' does not exist.",
    "SYSLOG_IDENTIFIER": "adduser",
    "PRIORITY": "2",
    "SYSLOG_FACILITY": "1",
    "SYSLOG_PID": "1952",
    "SYSLOG_TIMESTAMP": "Jun 13 23:28:32 ",
    "_BOOT_ID": "d36a10e696ef455e942dc1fb6fd3097e",
    "_CAP_EFFECTIVE": "1ffffffffff",
    "_CMDLINE": "adduser",
    "_COMM": "adduser",
    "_EXE": "/usr/bin/perl",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "1952",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749857312649235",
    "_SYSTEMD_CGROUP": "/system.slice/live-config.service",
    "_SYSTEMD_INVOCATION_ID": "c74fbfd2d2684a68aa8789e50fd8ab3c",
    "_SYSTEMD_SLICE": "system.slice",
    "_SYSTEMD_UNIT": "live-config.service",
    "_TRANSPORT": "syslog",
    "_UID": "0",
    "__CURSOR": "s=935cf24e9cbc4b60bce12d8eb10fa0fd;i=406;b=d36a10e696ef455e942dc1fb6fd3097e;m=203c9e6;t=6377c67873055;x=bdcbbed995be7aba",
    "__MONOTONIC_TIMESTAMP": "33802726",
    "__REALTIME_TIMESTAMP": "1749857312649301",
    "__SEQNUM": "1030",
    "__SEQNUM_ID": "935cf24e9cbc4b60bce12d8eb10fa0fd"
  },
  {
    "MESSAGE": "/usr/lib/udev/rules.d/90-alsa-restore.rules:18 GOTO=\"alsa_restore_std\" has no matching label, ignoring.",
    "SYSLOG_IDENTIFIER": "systemd-udevd",
    "CODE_FILE": "src/udev/udev-rules.c",
    "CODE_FUNC": "rule_resolve_goto",
    "CODE_LINE": "1376",
    "PRIORITY": "3",
    "SYSLOG_FACILITY": "3",
    "TID": "578",
    "_BOOT_ID": "d36a10e696ef455e942dc1fb6fd3097e",
    "_CAP_EFFECTIVE": "1f7fdffffff",
    "_CMDLINE": "/usr/lib/systemd/systemd-udevd",
    "_COMM": "systemd-udevd",
    "_EXE": "/usr/bin/udevadm",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "578",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749857575198165",
    "_SYSTEMD_CGROUP": "/system.slice/systemd-udevd.service/udev",
    "_SYSTEMD_INVOCATION_ID": "45c564c2c31e4ae79a3535a8789e2475",
    "_SYSTEMD_SLICE": "system.slice",
    "_SYSTEMD_UNIT": "systemd-udevd.service",
    "_TRANSPORT": "journal",
    "_UID": "0",
    "__CURSOR": "s=935cf24e9cbc4b60bce12d8eb10fa0fd;i=8bf;b=d36a10e696ef455e942dc1fb6fd3097e;m=8b252e8;t=6377c772d5e38;x=1876b67804fe513b",
    "__MONOTONIC_TIMESTAMP": "145904360",
    "__REALTIME_TIMESTAMP": "1749857575198264",
    "__SEQNUM": "2239",
    "__SEQNUM_ID": "935cf24e9cbc4b60bce12d8eb10fa0fd"
  },
  {
    "MESSAGE": "/usr/lib/udev/rules.d/90-alsa-restore.rules:22 GOTO=\"alsa_restore_std\" has no matching label, ignoring.",
    "SYSLOG_IDENTIFIER": "systemd-udevd",
    "CODE_FILE": "src/udev/udev-rules.c",
    "CODE_FUNC": "rule_resolve_goto",
    "CODE_LINE": "1376",
    "PRIORITY": "3",
    "SYSLOG_FACILITY": "3",
    "TID": "578",
    "_BOOT_ID": "d36a10e696ef455e942dc1fb6fd3097e",
    "_CAP_EFFECTIVE": "1f7fdffffff",
    "_CMDLINE": "/usr/lib/systemd/systemd-udevd",
    "_COMM": "systemd-udevd",
    "_EXE": "/usr/bin/udevadm",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "578",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749857575198189",
    "_SYSTEMD_CGROUP": "/system.slice/systemd-udevd.service/udev",
    "_SYSTEMD_INVOCATION_ID": "45c564c2c31e4ae79a3535a8789e2475",
    "_SYSTEMD_SLICE": "system.slice",
    "_SYSTEMD_UNIT": "systemd-udevd.service",
    "_TRANSPORT": "journal",
    "_UID": "0",
    "__CURSOR": "s=935cf24e9cbc4b60bce12d8eb10fa0fd;i=8c1;b=d36a10e696ef455e942dc1fb6fd3097e;m=8b2544e;t=6377c772d5f9d;x=1a0e68e9b1a73ea7",
    "__MONOTONIC_TIMESTAMP": "145904718",
    "__REALTIME_TIMESTAMP": "1749857575198621",
    "__SEQNUM": "2241",
    "__SEQNUM_ID": "935cf24e9cbc4b60bce12d8eb10fa0fd"
  },
  {
    "MESSAGE": "The user `amnesia' is not a member of group `sudo'.",
    "SYSLOG_IDENTIFIER": "deluser",
    "PRIORITY": "2",
    "SYSLOG_FACILITY": "1",
    "SYSLOG_PID": "6824",
    "SYSLOG_TIMESTAMP": "Jun 13 23:33:02 ",
    "_AUDIT_LOGINUID": "1000",
    "_AUDIT_SESSION": "4",
    "_BOOT_ID": "d36a10e696ef455e942dc1fb6fd3097e",
    "_CAP_EFFECTIVE": "1ffffffffff",
    "_CMDLINE": "/usr/bin/perl -T /usr/sbin/deluser amnesia sudo",
    "_COMM": "deluser",
    "_EXE": "/usr/bin/perl",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "6824",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749857582326353",
    "_SYSTEMD_CGROUP": "/user.slice/user-1000.slice/session-4.scope",
    "_SYSTEMD_INVOCATION_ID": "efd21f7ea3fc4c3e87606fe24cf75bce",
    "_SYSTEMD_OWNER_UID": "1000",
    "_SYSTEMD_SESSION": "4",
    "_SYSTEMD_SLICE": "user-1000.slice",
    "_SYSTEMD_UNIT": "session-4.scope",
    "_SYSTEMD_USER_SLICE": "-.slice",
    "_TRANSPORT": "syslog",
    "_UID": "0",
    "__CURSOR": "s=935cf24e9cbc4b60bce12d8eb10fa0fd;i=1220;b=d36a10e696ef455e942dc1fb6fd3097e;m=91f179b;t=6377c779a22e8;x=5c15902b044c37ea",
    "__MONOTONIC_TIMESTAMP": "153032603",
    "__REALTIME_TIMESTAMP": "1749857582326504",
    "__SEQNUM": "4640",
    "__SEQNUM_ID": "935cf24e9cbc4b60bce12d8eb10fa0fd"
  }
].
<false> is not true. (Test::Unit::AssertionFailedError)
./features/step_definitions/checks.rb:237:in `/^there are no unexpected messages of priority "err" or higher in the journal$/'
features/sane_defaults.feature:21:in `Then there are no unexpected messages of priority "err" or higher in the journal'
After features/support/hooks.rb:330 5.382

SCENARIO FAILED: 'No unexpected error messages in the journal after booting from DVD' (at time 06:48:42)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:48:42_No_unexpected_error_messages_in_the_journal_after_booting_from_DVD.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:48:42_No_unexpected_error_messages_in_the_journal_after_booting_from_DVD.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:48:42_No_unexpected_error_messages_in_the_journal_after_booting_from_DVD.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:48:42_No_unexpected_error_messages_in_the_journal_after_booting_from_DVD.journal

After features/support/hooks.rb:100 0.000
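
This failure is a content problem rather than a timing one: the journal contains messages of priority "err" or higher (the alsa-restore udev warnings, an adduser message about the missing powerdev group, and a deluser message about the sudo group) that the suite does not currently whitelist. Below is a sketch of the kind of filter behind this step, assuming journalctl's standard --priority and --output=json options; the allowlist patterns are simply the messages observed in this run, not the suite's real list:

    require 'json'

    # Messages seen in this run; adding them to the suite's allowlist would
    # make both journal scenarios pass again.
    allowlist = [
      /90-alsa-restore\.rules:\d+ GOTO="alsa_restore_std" has no matching label/,
      /The group `powerdev' does not exist\./,
      /The user `amnesia' is not a member of group `sudo'\./,
    ]

    entries = `journalctl --no-pager --output=json --priority=err`
                .each_line
                .map { |line| JSON.parse(line) }

    unexpected = entries.reject do |entry|
      allowlist.any? { |pattern| entry['MESSAGE'].to_s =~ pattern }
    end

    puts "Unexpected error messages in the journal: #{unexpected}" unless unexpected.empty?

The same messages reappear in the USB-drive variant of this scenario below, so a single allowlist update would cover both.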
Tags: @product
31.049
Scenario No unexpected error messages in the journal after booting from USB drive
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.005
Steps
Given I have started Tails without network from a USB drive with a persistent partition enabled and logged in 30.933
Then there are no unexpected messages of priority "err" or higher in the journal 0.116
Unexpected error messages in the journal: [
  {
    "MESSAGE": "/usr/lib/udev/rules.d/90-alsa-restore.rules:18 GOTO=\"alsa_restore_std\" has no matching label, ignoring.",
    "SYSLOG_IDENTIFIER": "systemd-udevd",
    "CODE_FILE": "src/udev/udev-rules.c",
    "CODE_FUNC": "rule_resolve_goto",
    "CODE_LINE": "1376",
    "PRIORITY": "3",
    "SYSLOG_FACILITY": "3",
    "TID": "705",
    "_BOOT_ID": "e72440bb68f74433b28d5df25e045018",
    "_CAP_EFFECTIVE": "1f7fdffffff",
    "_CMDLINE": "/usr/lib/systemd/systemd-udevd",
    "_COMM": "systemd-udevd",
    "_EXE": "/usr/bin/udevadm",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "705",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749860451871502",
    "_SYSTEMD_CGROUP": "/system.slice/systemd-udevd.service/udev",
    "_SYSTEMD_INVOCATION_ID": "dc6eaf52d3634e0cadb807b991fb5400",
    "_SYSTEMD_SLICE": "system.slice",
    "_SYSTEMD_UNIT": "systemd-udevd.service",
    "_TRANSPORT": "journal",
    "_UID": "0",
    "__CURSOR": "s=cc1262ee89ce47bc8e5abbbe2d6b56e1;i=365;b=e72440bb68f74433b28d5df25e045018;m=1126c67;t=6377d22a3eb1f;x=989bdbb47af38a02",
    "__MONOTONIC_TIMESTAMP": "17984615",
    "__REALTIME_TIMESTAMP": "1749860451871519",
    "__SEQNUM": "869",
    "__SEQNUM_ID": "cc1262ee89ce47bc8e5abbbe2d6b56e1"
  },
  {
    "MESSAGE": "/usr/lib/udev/rules.d/90-alsa-restore.rules:22 GOTO=\"alsa_restore_std\" has no matching label, ignoring.",
    "SYSLOG_IDENTIFIER": "systemd-udevd",
    "CODE_FILE": "src/udev/udev-rules.c",
    "CODE_FUNC": "rule_resolve_goto",
    "CODE_LINE": "1376",
    "PRIORITY": "3",
    "SYSLOG_FACILITY": "3",
    "TID": "705",
    "_BOOT_ID": "e72440bb68f74433b28d5df25e045018",
    "_CAP_EFFECTIVE": "1f7fdffffff",
    "_CMDLINE": "/usr/lib/systemd/systemd-udevd",
    "_COMM": "systemd-udevd",
    "_EXE": "/usr/bin/udevadm",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "705",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749860451871687",
    "_SYSTEMD_CGROUP": "/system.slice/systemd-udevd.service/udev",
    "_SYSTEMD_INVOCATION_ID": "dc6eaf52d3634e0cadb807b991fb5400",
    "_SYSTEMD_SLICE": "system.slice",
    "_SYSTEMD_UNIT": "systemd-udevd.service",
    "_TRANSPORT": "journal",
    "_UID": "0",
    "__CURSOR": "s=cc1262ee89ce47bc8e5abbbe2d6b56e1;i=367;b=e72440bb68f74433b28d5df25e045018;m=1126d12;t=6377d22a3ebcb;x=6e612ab8bfcc651b",
    "__MONOTONIC_TIMESTAMP": "17984786",
    "__REALTIME_TIMESTAMP": "1749860451871691",
    "__SEQNUM": "871",
    "__SEQNUM_ID": "cc1262ee89ce47bc8e5abbbe2d6b56e1"
  },
  {
    "MESSAGE": "The group `powerdev' does not exist.",
    "SYSLOG_IDENTIFIER": "adduser",
    "PRIORITY": "2",
    "SYSLOG_FACILITY": "1",
    "SYSLOG_PID": "2631",
    "SYSLOG_TIMESTAMP": "Jun 14 00:21:02 ",
    "_BOOT_ID": "e72440bb68f74433b28d5df25e045018",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "2631",
    "_RUNTIME_SCOPE": "system",
    "_SOURCE_REALTIME_TIMESTAMP": "1749860462797873",
    "_TRANSPORT": "syslog",
    "_UID": "0",
    "__CURSOR": "s=cc1262ee89ce47bc8e5abbbe2d6b56e1;i=41b;b=e72440bb68f74433b28d5df25e045018;m=1b95077;t=6377d234acf2d;x=b7aa213ce255e81b",
    "__MONOTONIC_TIMESTAMP": "28921975",
    "__REALTIME_TIMESTAMP": "1749860462808877",
    "__SEQNUM": "1051",
    "__SEQNUM_ID": "cc1262ee89ce47bc8e5abbbe2d6b56e1"
  },
  {
    "MESSAGE": "/usr/lib/udev/rules.d/90-alsa-restore.rules:18 GOTO=\"alsa_restore_std\" has no matching label, ignoring.",
    "SYSLOG_IDENTIFIER": "systemd-udevd",
    "CODE_FILE": "src/udev/udev-rules.c",
    "CODE_FUNC": "rule_resolve_goto",
    "CODE_LINE": "1376",
    "PRIORITY": "3",
    "SYSLOG_FACILITY": "3",
    "TID": "705",
    "_BOOT_ID": "e72440bb68f74433b28d5df25e045018",
    "_CAP_EFFECTIVE": "1f7fdffffff",
    "_CMDLINE": "/usr/lib/systemd/systemd-udevd",
    "_COMM": "systemd-udevd",
    "_EXE": "/usr/bin/udevadm",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "705",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749860637023023",
    "_SYSTEMD_CGROUP": "/system.slice/systemd-udevd.service/udev",
    "_SYSTEMD_INVOCATION_ID": "dc6eaf52d3634e0cadb807b991fb5400",
    "_SYSTEMD_SLICE": "system.slice",
    "_SYSTEMD_UNIT": "systemd-udevd.service",
    "_TRANSPORT": "journal",
    "_UID": "0",
    "__CURSOR": "s=cc1262ee89ce47bc8e5abbbe2d6b56e1;i=e8c;b=e72440bb68f74433b28d5df25e045018;m=baf328b;t=6377d2dad1bc2;x=bf8cf757aec538db",
    "__MONOTONIC_TIMESTAMP": "196031115",
    "__REALTIME_TIMESTAMP": "1749860637023170",
    "__SEQNUM": "3724",
    "__SEQNUM_ID": "cc1262ee89ce47bc8e5abbbe2d6b56e1"
  },
  {
    "MESSAGE": "/usr/lib/udev/rules.d/90-alsa-restore.rules:22 GOTO=\"alsa_restore_std\" has no matching label, ignoring.",
    "SYSLOG_IDENTIFIER": "systemd-udevd",
    "CODE_FILE": "src/udev/udev-rules.c",
    "CODE_FUNC": "rule_resolve_goto",
    "CODE_LINE": "1376",
    "PRIORITY": "3",
    "SYSLOG_FACILITY": "3",
    "TID": "705",
    "_BOOT_ID": "e72440bb68f74433b28d5df25e045018",
    "_CAP_EFFECTIVE": "1f7fdffffff",
    "_CMDLINE": "/usr/lib/systemd/systemd-udevd",
    "_COMM": "systemd-udevd",
    "_EXE": "/usr/bin/udevadm",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "705",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749860637023966",
    "_SYSTEMD_CGROUP": "/system.slice/systemd-udevd.service/udev",
    "_SYSTEMD_INVOCATION_ID": "dc6eaf52d3634e0cadb807b991fb5400",
    "_SYSTEMD_SLICE": "system.slice",
    "_SYSTEMD_UNIT": "systemd-udevd.service",
    "_TRANSPORT": "journal",
    "_UID": "0",
    "__CURSOR": "s=cc1262ee89ce47bc8e5abbbe2d6b56e1;i=e8e;b=e72440bb68f74433b28d5df25e045018;m=baf35ac;t=6377d2dad1ee3;x=f341516b00849554",
    "__MONOTONIC_TIMESTAMP": "196031916",
    "__REALTIME_TIMESTAMP": "1749860637023971",
    "__SEQNUM": "3726",
    "__SEQNUM_ID": "cc1262ee89ce47bc8e5abbbe2d6b56e1"
  },
  {
    "MESSAGE": "The user `amnesia' is not a member of group `sudo'.",
    "SYSLOG_IDENTIFIER": "deluser",
    "PRIORITY": "2",
    "SYSLOG_FACILITY": "1",
    "SYSLOG_PID": "12805",
    "SYSLOG_TIMESTAMP": "Jun 14 00:24:07 ",
    "_AUDIT_LOGINUID": "1000",
    "_AUDIT_SESSION": "4",
    "_BOOT_ID": "e72440bb68f74433b28d5df25e045018",
    "_CAP_EFFECTIVE": "1ffffffffff",
    "_CMDLINE": "/usr/bin/perl -T /usr/sbin/deluser amnesia sudo",
    "_COMM": "deluser",
    "_EXE": "/usr/bin/perl",
    "_GID": "0",
    "_HOSTNAME": "amnesia",
    "_MACHINE_ID": "203552d5819c41f3800e2c8ef2545404",
    "_PID": "12805",
    "_RUNTIME_SCOPE": "system",
    "_SELINUX_CONTEXT": "unconfined\n",
    "_SOURCE_REALTIME_TIMESTAMP": "1749860647235891",
    "_SYSTEMD_CGROUP": "/user.slice/user-1000.slice/session-4.scope",
    "_SYSTEMD_INVOCATION_ID": "464c302a7c124749bcb52b1d9a277e56",
    "_SYSTEMD_OWNER_UID": "1000",
    "_SYSTEMD_SESSION": "4",
    "_SYSTEMD_SLICE": "user-1000.slice",
    "_SYSTEMD_UNIT": "session-4.scope",
    "_SYSTEMD_USER_SLICE": "-.slice",
    "_TRANSPORT": "syslog",
    "_UID": "0",
    "__CURSOR": "s=cc1262ee89ce47bc8e5abbbe2d6b56e1;i=188e;b=e72440bb68f74433b28d5df25e045018;m=c4b082c;t=6377d2e48f163;x=848efd093822d932",
    "__MONOTONIC_TIMESTAMP": "206243884",
    "__REALTIME_TIMESTAMP": "1749860647235939",
    "__SEQNUM": "6286",
    "__SEQNUM_ID": "cc1262ee89ce47bc8e5abbbe2d6b56e1"
  }
].
<false> is not true. (Test::Unit::AssertionFailedError)
./features/step_definitions/checks.rb:237:in `/^there are no unexpected messages of priority "err" or higher in the journal$/'
features/sane_defaults.feature:25:in `Then there are no unexpected messages of priority "err" or higher in the journal'
After features/support/hooks.rb:330 6.089

SCENARIO FAILED: 'No unexpected error messages in the journal after booting from USB drive' (at time 06:49:18)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:49:18_No_unexpected_error_messages_in_the_journal_after_booting_from_USB_drive.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:49:18_No_unexpected_error_messages_in_the_journal_after_booting_from_USB_drive.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:49:18_No_unexpected_error_messages_in_the_journal_after_booting_from_USB_drive.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:49:18_No_unexpected_error_messages_in_the_journal_after_booting_from_USB_drive.journal

After features/support/hooks.rb:100 0.000
Tags: @product
27.577
Scenario I can shutdown Tails via the system menu
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.005
Steps
Given I have started Tails from DVD and logged in and the network is connected 13.617
When I request a shutdown using the system menu 13.960
try_for() timeout expired (Timeout::Error)
./features/support/helpers/misc_helpers.rb:145:in `rescue in try_for'
./features/support/helpers/misc_helpers.rb:51:in `try_for'
./features/support/helpers/dogtail.rb:371:in `grabFocus'
./features/step_definitions/common_steps.rb:887:in `/^I request a (shutdown|reboot) using the system menu$/'
features/shutdown_applet.feature:6:in `When I request a shutdown using the system menu'
Then Tails eventually shuts down 0.000
After features/support/hooks.rb:330 5.585

SCENARIO FAILED: 'I can shutdown Tails via the system menu' (at time 06:49:52)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:49:52_I_can_shutdown_Tails_via_the_system_menu.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:49:52_I_can_shutdown_Tails_via_the_system_menu.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:49:52_I_can_shutdown_Tails_via_the_system_menu.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:49:52_I_can_shutdown_Tails_via_the_system_menu.journal

After features/support/hooks.rb:100 0.000
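
Both system-menu scenarios time out inside Dogtail's grabFocus, i.e. the menu item never became focusable within try_for()'s deadline. For reference, a try_for-style helper boils down to wrapping a polling loop in Ruby's Timeout; this is a minimal sketch in the spirit of misc_helpers.rb, not its actual implementation, and the usage comment is hypothetical:

    require 'timeout'

    def try_for(timeout, delay: 1)
      Timeout.timeout(timeout) do
        loop do
          return true if yield
          sleep delay
        end
      end
    rescue Timeout::Error
      raise Timeout::Error, 'try_for() timeout expired'
    end

    # Hypothetical usage: the block that never returned truthy here was the
    # focus check on the shutdown/reboot entry in the system menu.
    # try_for(60) { system_menu_entry_focused? }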
27.144
Scenario I can reboot Tails via the system menu
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.005
Steps
Given I have started Tails from DVD and logged in and the network is connected 12.827
When I request a reboot using the system menu 14.316
try_for() timeout expired (Timeout::Error)
./features/support/helpers/misc_helpers.rb:145:in `rescue in try_for'
./features/support/helpers/misc_helpers.rb:51:in `try_for'
./features/support/helpers/dogtail.rb:371:in `grabFocus'
./features/step_definitions/common_steps.rb:887:in `/^I request a (shutdown|reboot) using the system menu$/'
features/shutdown_applet.feature:12:in `When I request a reboot using the system menu'
Then Tails eventually restarts 0.000
After features/support/hooks.rb:330 5.619

SCENARIO FAILED: 'I can reboot Tails via the system menu' (at time 06:50:25)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:50:25_I_can_reboot_Tails_via_the_system_menu.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:50:25_I_can_reboot_Tails_via_the_system_menu.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:50:25_I_can_reboot_Tails_via_the_system_menu.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/06:50:25_I_can_reboot_Tails_via_the_system_menu.journal

After features/support/hooks.rb:100 0.000
1:16.428
Scenario The Downloads directory is usable in Tor Browser
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.005
Before features/support/hooks.rb:507 0.006
Steps
Given I have started Tails from DVD and logged in and the network is connected 11.892
Then the live user's Downloads directory exists 0.082
And there is a GNOME bookmark for the Downloads directory 19.653
When I start the Tor Browser 27.826
And the Tor Browser loads the startup page 1.516
Then I can save the current page as "index.html" to the Downloads directory 15.456
The Dogtail init script raised: SearchError: child of [desktop frame | main]: 'org.gnome.Nautilus' application
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:70:in `initialize'
./features/step_definitions/browser.rb:6:in `new'
./features/step_definitions/browser.rb:6:in `desktop_portal_save_as'
./features/step_definitions/browser.rb:38:in `browser_save_page_as'
./features/step_definitions/browser.rb:543:in `/^I (can|cannot) save the current page as "([^"]+[.]html)" to the (.*) (directory|GNOME bookmark)$/'
features/torified_browsing.feature:24:in `Then I can save the current page as "index.html" to the Downloads directory'
And I can print the current page as "output.pdf" to the Downloads directory 0.000
After features/support/hooks.rb:515 0.053
After features/support/hooks.rb:330 8.767

SCENARIO FAILED: 'The Downloads directory is usable in Tor Browser' (at time 07:17:26)

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:17:26_The_Downloads_directory_is_usable_in_Tor_Browser.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:17:26_The_Downloads_directory_is_usable_in_Tor_Browser.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:17:26_The_Downloads_directory_is_usable_in_Tor_Browser.journal

After features/support/hooks.rb:100 0.000
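
This is the first of several scenarios failing with the same Dogtail SearchError: the helper behind desktop_portal_save_as (features/step_definitions/browser.rb:6) looks for an 'org.gnome.Nautilus' application on the accessibility bus and finds none, which suggests the save-as dialog is no longer exposed under that name in trixie. A quick diagnostic sketch, assuming python3 and the dogtail package are usable in the guest session (the suite's own Dogtail helper already depends on them) and dogtail's standard tree.root.applications() call:

    # List the applications currently visible on the AT-SPI bus, to see under
    # which name (if any) the file chooser dialog is exposed.
    list_apps = %q(python3 -c "from dogtail import tree; [print(a.name) for a in tree.root.applications()]")
    puts `#{list_apps}`

Running this at the moment of failure would show what the accessibility bus actually offers while the save-as dialog is supposed to be open.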
Tags: @product
43.550
Scenario The Tor Browser uses TBB's shared libraries
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.007
Steps
Given I have started Tails from DVD and logged in and the network is connected 11.732
When I start the Tor Browser 27.234
Then the Tor Browser uses all expected TBB shared libraries 4.583
The loaded shared libraries for the firefox process are not the way we expect them.
Expected TBB libs that are absent: ["libfreeblpriv3.so", "libgkcodecs.so", "liblgpllibs.so", "libmozgtk.so", "libmozsandbox.so", "libmozsqlite3.so", "libmozwayland.so", "libnspr4.so", "libnss3.so", "libnssckbi.so", "libnssutil3.so", "libplc4.so", "libplds4.so", "libsmime3.so", "libsoftokn3.so", "libssl3.so", "libxul.so"]
Native libs that we don't want: [].
<false> is not true. (Test::Unit::AssertionFailedError)
./features/step_definitions/browser.rb:259:in `xul_app_shared_lib_check'
./features/step_definitions/browser.rb:273:in `/^the (.*) uses all expected TBB shared libraries$/'
features/torified_browsing.feature:76:in `Then the Tor Browser uses all expected TBB shared libraries'
After features/support/hooks.rb:330 5.575

SCENARIO FAILED: 'The Tor Browser uses TBB's shared libraries' (at time 07:24:34)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:24:34_The_Tor_Browser_uses_TBB_s_shared_libraries.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:24:34_The_Tor_Browser_uses_TBB_s_shared_libraries.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:24:34_The_Tor_Browser_uses_TBB_s_shared_libraries.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:24:34_The_Tor_Browser_uses_TBB_s_shared_libraries.journal

After features/support/hooks.rb:100 0.000
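
This step compares the shared objects mapped by the firefox process against a list of libraries that must come from the Tor Browser bundle; here every expected TBB library is reported absent while no unwanted native libraries are found, which suggests either that the check inspected a firefox process that is not the Tor Browser, or that the bundle's library layout changed in this build. A minimal sketch of this kind of check via /proc/<pid>/maps; the PID lookup and the expected list are illustrative, not the suite's actual values:

    # Which shared objects does the firefox process have mapped?
    expected = %w[libxul.so libnss3.so libnspr4.so libmozsqlite3.so]

    pid = `pgrep -o firefox`.strip
    abort 'no firefox process found' if pid.empty?

    loaded = File.readlines("/proc/#{pid}/maps")
                 .map { |line| File.basename(line.split.last.to_s) }
                 .select { |name| name.include?('.so') }
                 .uniq

    missing = expected - loaded
    puts "Expected TBB libs that are absent: #{missing}" unless missing.empty?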
Tags: @product
2:3.480
Scenario The Persistent directory is usable in Tor Browser
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.006
Steps
Given I have started Tails without network from a USB drive with a persistent partition enabled and logged in 10.958
And the network is plugged 0.085
And I successfully configure Tor 21.096
And available upgrades have been checked 2.259
And all notifications have disappeared 7.102
And there is a GNOME bookmark for the Persistent directory 24.260
When I start the Tor Browser 24.322
And I download some file in the Tor Browser to the Persistent directory 33.394
The Dogtail init script raised: SearchError: child of [desktop frame | main]: 'org.gnome.Nautilus' application
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:70:in `initialize'
./features/step_definitions/browser.rb:6:in `new'
./features/step_definitions/browser.rb:6:in `desktop_portal_save_as'
./features/step_definitions/browser.rb:310:in `/^I download some file in the Tor Browser to the (.*) directory$/'
features/torified_browsing.feature:113:in `And I download some file in the Tor Browser to the Persistent directory'
Then the file is saved to the Persistent directory 0.000
When I open the address "https://tails.net/about" in the Tor Browser 0.000
Then "Tails - How Tails works" has loaded in the Tor Browser 0.000
And I can print the current page as "output.pdf" to the Persistent directory 0.000
After features/support/hooks.rb:330 8.198

SCENARIO FAILED: 'The Persistent directory is usable in Tor Browser' (at time 07:30:05)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:30:05_The_Persistent_directory_is_usable_in_Tor_Browser.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:30:05_The_Persistent_directory_is_usable_in_Tor_Browser.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:30:05_The_Persistent_directory_is_usable_in_Tor_Browser.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:30:05_The_Persistent_directory_is_usable_in_Tor_Browser.journal

After features/support/hooks.rb:100 0.000
Tags: @product
1:15.241
Scenario Outline The default XDG directories are usable in Tor Browser
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.006
Steps
Given I have started Tails from DVD without network and logged in 9.087
Then the live user's Documents directory exists 0.254
And there is a GNOME bookmark for the Documents directory 19.130
Then I start the Tor Browser in offline mode 29.155
And I can save the current page as "index.html" to the Documents GNOME bookmark 17.614
The Dogtail init script raised: SearchError: child of [desktop frame | main]: 'org.gnome.Nautilus' application
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:70:in `initialize'
./features/step_definitions/browser.rb:6:in `new'
./features/step_definitions/browser.rb:6:in `desktop_portal_save_as'
./features/step_definitions/browser.rb:38:in `browser_save_page_as'
./features/step_definitions/browser.rb:543:in `/^I (can|cannot) save the current page as "([^"]+[.]html)" to the (.*) (directory|GNOME bookmark)$/'
features/torified_browsing.feature:127:in `And I can save the current page as "index.html" to the Documents GNOME bookmark'
features/torified_browsing.feature:124:in `And I can save the current page as "index.html" to the <dir> GNOME bookmark'
After features/support/hooks.rb:330 9.977

SCENARIO FAILED: 'The default XDG directories are usable in Tor Browser, Examples (#1)' (at time 07:31:29)

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:31:29_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___1_.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:31:29_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___1_.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:31:29_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___1_.journal

After features/support/hooks.rb:100 0.000
Tags: @product
1:18.943
Scenario Outline The default XDG directories are usable in Tor Browser
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.007
Steps
Given I have started Tails from DVD without network and logged in 9.413
Then the live user's Downloads directory exists 0.133
And there is a GNOME bookmark for the Downloads directory 19.218
Then I start the Tor Browser in offline mode 31.656
And I can save the current page as "index.html" to the Downloads GNOME bookmark 18.521
The Dogtail init script raised: SearchError: child of [desktop frame | main]: 'org.gnome.Nautilus' application
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:70:in `initialize'
./features/step_definitions/browser.rb:6:in `new'
./features/step_definitions/browser.rb:6:in `desktop_portal_save_as'
./features/step_definitions/browser.rb:38:in `browser_save_page_as'
./features/step_definitions/browser.rb:543:in `/^I (can|cannot) save the current page as "([^"]+[.]html)" to the (.*) (directory|GNOME bookmark)$/'
features/torified_browsing.feature:128:in `And I can save the current page as "index.html" to the Downloads GNOME bookmark'
features/torified_browsing.feature:124:in `And I can save the current page as "index.html" to the <dir> GNOME bookmark'
After features/support/hooks.rb:330 11.347

SCENARIO FAILED: 'The default XDG directories are usable in Tor Browser, Examples (#2)' (at time 07:32:58)

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:32:58_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___2_.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:32:58_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___2_.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:32:58_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___2_.journal

After features/support/hooks.rb:100 0.000
Tags: @product
1:10.493
Scenario Outline The default XDG directories are usable in Tor Browser
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.006
Steps
Given I have started Tails from DVD without network and logged in 9.192
Then the live user's Music directory exists 0.098
And there is a GNOME bookmark for the Music directory 19.320
Then I start the Tor Browser in offline mode 21.957
And I can save the current page as "index.html" to the Music GNOME bookmark 19.924
The Dogtail init script raised: SearchError: child of [desktop frame | main]: 'org.gnome.Nautilus' application
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:70:in `initialize'
./features/step_definitions/browser.rb:6:in `new'
./features/step_definitions/browser.rb:6:in `desktop_portal_save_as'
./features/step_definitions/browser.rb:38:in `browser_save_page_as'
./features/step_definitions/browser.rb:543:in `/^I (can|cannot) save the current page as "([^"]+[.]html)" to the (.*) (directory|GNOME bookmark)$/'
features/torified_browsing.feature:129:in `And I can save the current page as "index.html" to the Music GNOME bookmark'
features/torified_browsing.feature:124:in `And I can save the current page as "index.html" to the <dir> GNOME bookmark'
After features/support/hooks.rb:330 10.435

SCENARIO FAILED: 'The default XDG directories are usable in Tor Browser, Examples (#3)' (at time 07:34:20)

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:34:20_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___3_.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:34:20_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___3_.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:34:20_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___3_.journal

After features/support/hooks.rb:100 0.000
Tags: @product
1:13.709
Scenario Outline The default XDG directories are usable in Tor Browser
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.006
Steps
Given I have started Tails from DVD without network and logged in 9.113
Then the live user's Pictures directory exists 0.179
And there is a GNOME bookmark for the Pictures directory 19.198
Then I start the Tor Browser in offline mode 21.049
And I can save the current page as "index.html" to the Pictures GNOME bookmark 24.168
The Dogtail init script raised: SearchError: child of [desktop frame | main]: 'org.gnome.Nautilus' application
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:70:in `initialize'
./features/step_definitions/browser.rb:6:in `new'
./features/step_definitions/browser.rb:6:in `desktop_portal_save_as'
./features/step_definitions/browser.rb:38:in `browser_save_page_as'
./features/step_definitions/browser.rb:543:in `/^I (can|cannot) save the current page as "([^"]+[.]html)" to the (.*) (directory|GNOME bookmark)$/'
features/torified_browsing.feature:130:in `And I can save the current page as "index.html" to the Pictures GNOME bookmark'
features/torified_browsing.feature:124:in `And I can save the current page as "index.html" to the <dir> GNOME bookmark'
After features/support/hooks.rb:330 9.688

SCENARIO FAILED: 'The default XDG directories are usable in Tor Browser, Examples (#4)' (at time 07:35:44)

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:35:44_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___4_.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:35:44_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___4_.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:35:44_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___4_.journal

After features/support/hooks.rb:100 0.000
Tags: @product
1:15.253
Scenario Outline The default XDG directories are usable in Tor Browser
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.006
Steps
Given I have started Tails from DVD without network and logged in 8.832
Then the live user's Videos directory exists 0.107
And there is a GNOME bookmark for the Videos directory 19.455
Then I start the Tor Browser in offline mode 22.675
And I can save the current page as "index.html" to the Videos GNOME bookmark 24.184
The Dogtail init script raised: SearchError: child of [desktop frame | main]: 'org.gnome.Nautilus' application
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:70:in `initialize'
./features/step_definitions/browser.rb:6:in `new'
./features/step_definitions/browser.rb:6:in `desktop_portal_save_as'
./features/step_definitions/browser.rb:38:in `browser_save_page_as'
./features/step_definitions/browser.rb:543:in `/^I (can|cannot) save the current page as "([^"]+[.]html)" to the (.*) (directory|GNOME bookmark)$/'
features/torified_browsing.feature:131:in `And I can save the current page as "index.html" to the Videos GNOME bookmark'
features/torified_browsing.feature:124:in `And I can save the current page as "index.html" to the <dir> GNOME bookmark'
After features/support/hooks.rb:330 9.128

SCENARIO FAILED: 'The default XDG directories are usable in Tor Browser, Examples (#5)' (at time 07:37:09)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:37:09_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___5_.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:37:09_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___5_.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:37:09_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___5_.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:37:09_The_default_XDG_directories_are_usable_in_Tor_Browser,_Examples___5_.journal

After features/support/hooks.rb:100 0.000
54.115
Scenario Starting a second instance of the Unsafe Browser results in an error message being shown
Before features/support/hooks.rb:266 0.000
Before features/support/hooks.rb:273 0.006
Steps
Given I have started Tails from DVD and logged in and the network is connected 13.234
When I successfully start the Unsafe Browser 19.515
# Wait for whatever facility the GNOME Activities Overview uses to
# learn about which applications are running to "settle". Without
# this sleep, it is confused and it's impossible to start a new
# instance (it will just switch to the one we already started).
And I wait 10 seconds 10.008

Slept for 10 seconds

And I try to start the Unsafe Browser 0.401
Then I see a warning about another instance already running 10.954
The Dogtail init script raised: SearchError: child of [desktop frame | main]: 'zenity' application
STDOUT:

STDERR:

 (Dogtail::Failure)
./features/support/helpers/dogtail.rb:93:in `run'
./features/support/helpers/dogtail.rb:70:in `initialize'
./features/step_definitions/unsafe_browser.rb:76:in `new'
./features/step_definitions/unsafe_browser.rb:76:in `/^I see a warning about another instance already running$/'
features/unsafe_browser.feature:59:in `Then I see a warning about another instance already running'
After features/support/hooks.rb:330 11.155

SCENARIO FAILED: 'Starting a second instance of the Unsafe Browser results in an error message being shown' (at time 07:51:37)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:51:37_Starting_a_second_instance_of_the_Unsafe_Browser_results_in_an_error_message_being_shown.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:51:37_Starting_a_second_instance_of_the_Unsafe_Browser_results_in_an_error_message_being_shown.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:51:37_Starting_a_second_instance_of_the_Unsafe_Browser_results_in_an_error_message_being_shown.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:51:37_Starting_a_second_instance_of_the_Unsafe_Browser_results_in_an_error_message_being_shown.journal

After features/support/hooks.rb:100 0.000
22.489
Scenario Use Unlock VeraCrypt Volumes to unlock a USB drive that has a basic VeraCrypt volume
Steps
When I plug a USB drive containing a basic VeraCrypt volume 8.265
And I cancel the GNOME authentication prompt 3.114
And I unlock and mount this VeraCrypt volume with Unlock VeraCrypt Volumes 6.974
And I open this VeraCrypt volume in GNOME Files 3.387
Then I see the expected contents in this VeraCrypt volume 0.640
When I lock the currently opened VeraCrypt volume 0.109
Command failed: udisksctl unmount --block-device /dev/mapper/tcrypt-*
error code: 1
stdout: 
stderr: Error unmounting /dev/dm-0: GDBus.Error:org.freedesktop.UDisks2.Error.DeviceBusy: Error unmounting /dev/dm-0: target is busy
.
<false> is not true. (ExecutionFailedInVM)
./features/support/helpers/vm_helper.rb:496:in `rescue in execute_successfully'
./features/support/helpers/vm_helper.rb:493:in `execute_successfully'
./features/step_definitions/veracrypt.rb:330:in `/^I lock the currently opened VeraCrypt (volume|file container)$/'
features/veracrypt.feature:18:in `When I lock the currently opened VeraCrypt volume'
Then the VeraCrypt volume has been unmounted and locked 0.000
After features/support/hooks.rb:330 5.607

SCENARIO FAILED: 'Use Unlock VeraCrypt Volumes to unlock a USB drive that has a basic VeraCrypt volume' (at time 07:56:21)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:56:21_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_USB_drive_that_has_a_basic_VeraCrypt_volume.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:56:21_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_USB_drive_that_has_a_basic_VeraCrypt_volume.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:56:21_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_USB_drive_that_has_a_basic_VeraCrypt_volume.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:56:21_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_USB_drive_that_has_a_basic_VeraCrypt_volume.journal

After features/support/hooks.rb:100 0.000
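
All six VeraCrypt scenarios fail the same way: udisksctl reports the volume as busy at the moment the test tries to lock it, most likely because GNOME Files (or a thumbnailer) still has the freshly opened mount in use. A common workaround sketch, not the suite's code: retry the unmount a few times and, if it keeps failing, list what holds the target busy. The device path and timings below are illustrative:

    device = '/dev/dm-0'

    # Retry the unmount a few times with a short delay in between.
    unmounted = 5.times.any? do
      ok = system('udisksctl', 'unmount', '--block-device', device)
      sleep 2 unless ok
      ok
    end

    unless unmounted
      # `fuser -vm` lists the processes that keep the mounted filesystem busy.
      system('fuser', '-vm', device)
      raise "could not unmount #{device}: target is busy"
    end

In the failing case, the fuser output would show which process still has the mount point open and is blocking udisks.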
28.837
Scenario Use Unlock VeraCrypt Volumes to unlock a USB drive that has a hidden VeraCrypt volume
Steps
When I plug a USB drive containing a hidden VeraCrypt volume 11.818
And I cancel the GNOME authentication prompt 2.492
And I unlock and mount this VeraCrypt volume with Unlock VeraCrypt Volumes 9.830
And I open this VeraCrypt volume in GNOME Files 4.112
Then I see the expected contents in this VeraCrypt volume 0.473
When I lock the currently opened VeraCrypt volume 0.109
Command failed: udisksctl unmount --block-device /dev/mapper/tcrypt-*
error code: 1
stdout: 
stderr: Error unmounting /dev/dm-0: GDBus.Error:org.freedesktop.UDisks2.Error.DeviceBusy: Error unmounting /dev/dm-0: target is busy
.
<false> is not true. (ExecutionFailedInVM)
./features/support/helpers/vm_helper.rb:496:in `rescue in execute_successfully'
./features/support/helpers/vm_helper.rb:493:in `execute_successfully'
./features/step_definitions/veracrypt.rb:330:in `/^I lock the currently opened VeraCrypt (volume|file container)$/'
features/veracrypt.feature:27:in `When I lock the currently opened VeraCrypt volume'
Then the VeraCrypt volume has been unmounted and locked 0.000
After features/support/hooks.rb:330 5.736

SCENARIO FAILED: 'Use Unlock VeraCrypt Volumes to unlock a USB drive that has a hidden VeraCrypt volume' (at time 07:57:05)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:57:05_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_USB_drive_that_has_a_hidden_VeraCrypt_volume.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:57:05_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_USB_drive_that_has_a_hidden_VeraCrypt_volume.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:57:05_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_USB_drive_that_has_a_hidden_VeraCrypt_volume.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:57:05_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_USB_drive_that_has_a_hidden_VeraCrypt_volume.journal

After features/support/hooks.rb:100 0.001
32.654
Scenario Use Unlock VeraCrypt Volumes to unlock a basic VeraCrypt file container with a PIM
Steps
When I plug and mount a USB drive containing a basic VeraCrypt file container with a PIM 17.516
And I unlock and mount this VeraCrypt file container with Unlock VeraCrypt Volumes 11.411
And I open this VeraCrypt volume in GNOME Files 3.094
Then I see the expected contents in this VeraCrypt volume 0.510
When I lock the currently opened VeraCrypt file container 0.121
Command failed: udisksctl unmount --block-device /dev/mapper/tcrypt-*
error code: 1
stdout: 
stderr: Error unmounting /dev/dm-0: GDBus.Error:org.freedesktop.UDisks2.Error.DeviceBusy: Error unmounting /dev/dm-0: target is busy
.
<false> is not true. (ExecutionFailedInVM)
./features/support/helpers/vm_helper.rb:496:in `rescue in execute_successfully'
./features/support/helpers/vm_helper.rb:493:in `execute_successfully'
./features/step_definitions/veracrypt.rb:330:in `/^I lock the currently opened VeraCrypt (volume|file container)$/'
features/veracrypt.feature:35:in `When I lock the currently opened VeraCrypt file container'
Then the VeraCrypt file container has been unmounted and locked 0.000
After features/support/hooks.rb:330 5.250

SCENARIO FAILED: 'Use Unlock VeraCrypt Volumes to unlock a basic VeraCrypt file container with a PIM' (at time 07:57:53)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:57:53_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_basic_VeraCrypt_file_container_with_a_PIM.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:57:53_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_basic_VeraCrypt_file_container_with_a_PIM.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:57:53_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_basic_VeraCrypt_file_container_with_a_PIM.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:57:53_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_basic_VeraCrypt_file_container_with_a_PIM.journal

After features/support/hooks.rb:100 0.000
41.698
Scenario Use Unlock VeraCrypt Volumes to unlock a hidden VeraCrypt file container
Steps
When I plug and mount a USB drive containing a hidden VeraCrypt file container 27.474
And I unlock and mount this VeraCrypt file container with Unlock VeraCrypt Volumes 9.603
And I open this VeraCrypt volume in GNOME Files 3.867
Then I see the expected contents in this VeraCrypt volume 0.565
When I lock the currently opened VeraCrypt file container 0.186
Command failed: udisksctl unmount --block-device /dev/mapper/tcrypt-*
error code: 1
stdout: 
stderr: Error unmounting /dev/dm-0: GDBus.Error:org.freedesktop.UDisks2.Error.DeviceBusy: Error unmounting /dev/dm-0: target is busy
.
<false> is not true. (ExecutionFailedInVM)
./features/support/helpers/vm_helper.rb:496:in `rescue in execute_successfully'
./features/support/helpers/vm_helper.rb:493:in `execute_successfully'
./features/step_definitions/veracrypt.rb:330:in `/^I lock the currently opened VeraCrypt (volume|file container)$/'
features/veracrypt.feature:43:in `When I lock the currently opened VeraCrypt file container'
Then the VeraCrypt file container has been unmounted and locked 0.000
After features/support/hooks.rb:330 5.595

SCENARIO FAILED: 'Use Unlock VeraCrypt Volumes to unlock a hidden VeraCrypt file container' (at time 07:58:50)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:58:50_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_hidden_VeraCrypt_file_container.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:58:50_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_hidden_VeraCrypt_file_container.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:58:50_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_hidden_VeraCrypt_file_container.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:58:50_Use_Unlock_VeraCrypt_Volumes_to_unlock_a_hidden_VeraCrypt_file_container.journal

After features/support/hooks.rb:100 0.018
27.266
Scenario Use GNOME Disks to unlock a USB drive that has a basic VeraCrypt volume with a keyfile
Steps
When I plug a USB drive containing a basic VeraCrypt volume with a keyfile 8.387
And I cancel the GNOME authentication prompt 2.767
And I unlock and mount this VeraCrypt volume with GNOME Disks 11.190
And I open this VeraCrypt volume in GNOME Files 3.823
Then I see the expected contents in this VeraCrypt volume 0.817
When I lock the currently opened VeraCrypt volume 0.280
Command failed: udisksctl unmount --block-device /dev/mapper/tcrypt-*
error code: 1
stdout: 
stderr: Error unmounting /dev/dm-0: GDBus.Error:org.freedesktop.UDisks2.Error.DeviceBusy: Error unmounting /dev/dm-0: target is busy
.
<false> is not true. (ExecutionFailedInVM)
./features/support/helpers/vm_helper.rb:496:in `rescue in execute_successfully'
./features/support/helpers/vm_helper.rb:493:in `execute_successfully'
./features/step_definitions/veracrypt.rb:330:in `/^I lock the currently opened VeraCrypt (volume|file container)$/'
features/veracrypt.feature:54:in `When I lock the currently opened VeraCrypt volume'
Then the VeraCrypt volume has been unmounted and locked 0.000
After features/support/hooks.rb:330 5.526

SCENARIO FAILED: 'Use GNOME Disks to unlock a USB drive that has a basic VeraCrypt volume with a keyfile' (at time 07:59:47)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:59:47_Use_GNOME_Disks_to_unlock_a_USB_drive_that_has_a_basic_VeraCrypt_volume_with_a_keyfile.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:59:47_Use_GNOME_Disks_to_unlock_a_USB_drive_that_has_a_basic_VeraCrypt_volume_with_a_keyfile.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:59:47_Use_GNOME_Disks_to_unlock_a_USB_drive_that_has_a_basic_VeraCrypt_volume_with_a_keyfile.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/07:59:47_Use_GNOME_Disks_to_unlock_a_USB_drive_that_has_a_basic_VeraCrypt_volume_with_a_keyfile.journal

After features/support/hooks.rb:100 0.000
29.605
Scenario Use GNOME Disks to unlock a USB drive that has a hidden VeraCrypt volume
Steps
When I plug a USB drive containing a hidden VeraCrypt volume 11.379
And I cancel the GNOME authentication prompt 2.601
And I unlock and mount this VeraCrypt volume with GNOME Disks 11.005
And I open this VeraCrypt volume in GNOME Files 4.059
Then I see the expected contents in this VeraCrypt volume 0.442
When I lock the currently opened VeraCrypt volume 0.117
Command failed: udisksctl unmount --block-device /dev/mapper/tcrypt-*
error code: 1
stdout: 
stderr: Error unmounting /dev/dm-0: GDBus.Error:org.freedesktop.UDisks2.Error.DeviceBusy: Error unmounting /dev/dm-0: target is busy
.
<false> is not true. (ExecutionFailedInVM)
./features/support/helpers/vm_helper.rb:496:in `rescue in execute_successfully'
./features/support/helpers/vm_helper.rb:493:in `execute_successfully'
./features/step_definitions/veracrypt.rb:330:in `/^I lock the currently opened VeraCrypt (volume|file container)$/'
features/veracrypt.feature:63:in `When I lock the currently opened VeraCrypt volume'
Then the VeraCrypt volume has been unmounted and locked 0.000
After features/support/hooks.rb:330 5.907

SCENARIO FAILED: 'Use GNOME Disks to unlock a USB drive that has a hidden VeraCrypt volume' (at time 08:00:31)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/08:00:31_Use_GNOME_Disks_to_unlock_a_USB_drive_that_has_a_hidden_VeraCrypt_volume.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/08:00:31_Use_GNOME_Disks_to_unlock_a_USB_drive_that_has_a_hidden_VeraCrypt_volume.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/08:00:31_Use_GNOME_Disks_to_unlock_a_USB_drive_that_has_a_hidden_VeraCrypt_volume.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/08:00:31_Use_GNOME_Disks_to_unlock_a_USB_drive_that_has_a_hidden_VeraCrypt_volume.journal

After features/support/hooks.rb:100 0.000
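
Note: the GNOME Disks scenarios above fail at the same lock step as the Unlock VeraCrypt Volumes scenarios, immediately after "I open this VeraCrypt volume in GNOME Files", which points at the Files window (or a gvfs process spawned for it) as the likely holder of the busy mount. A hedged sketch of one possible mitigation, assuming the $vm.execute helper; the pkill pattern is an assumption, not something taken from this run.

# Close any GNOME Files window before asking udisks to unmount, so the
# cleartext device is no longer in use when the lock step runs. A nonzero
# exit status from pkill only means no matching process was found.
$vm.execute('pkill --full nautilus')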
52.659
Scenario Use GNOME Disks to unlock a basic VeraCrypt file container with a keyfile
Steps
When I plug and mount a USB drive containing a basic VeraCrypt file container with a keyfile 24.430
And I unlock and mount this VeraCrypt file container with GNOME Disks 24.369
And I open this VeraCrypt volume in GNOME Files 3.311
Then I see the expected contents in this VeraCrypt volume 0.455
When I lock the currently opened VeraCrypt file container 0.092
Command failed: udisksctl unmount --block-device /dev/mapper/tcrypt-*
error code: 1
stdout: 
stderr: Error unmounting /dev/dm-0: GDBus.Error:org.freedesktop.UDisks2.Error.DeviceBusy: Error unmounting /dev/dm-0: target is busy
.
<false> is not true. (ExecutionFailedInVM)
./features/support/helpers/vm_helper.rb:496:in `rescue in execute_successfully'
./features/support/helpers/vm_helper.rb:493:in `execute_successfully'
./features/step_definitions/veracrypt.rb:330:in `/^I lock the currently opened VeraCrypt (volume|file container)$/'
features/veracrypt.feature:71:in `When I lock the currently opened VeraCrypt file container'
Then the VeraCrypt file container has been unmounted and locked 0.000
After features/support/hooks.rb:330 8.759

SCENARIO FAILED: 'Use GNOME Disks to unlock a basic VeraCrypt file container with a keyfile' (at time 08:01:38)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/08:01:38_Use_GNOME_Disks_to_unlock_a_basic_VeraCrypt_file_container_with_a_keyfile.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/08:01:38_Use_GNOME_Disks_to_unlock_a_basic_VeraCrypt_file_container_with_a_keyfile.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/08:01:38_Use_GNOME_Disks_to_unlock_a_basic_VeraCrypt_file_container_with_a_keyfile.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/08:01:38_Use_GNOME_Disks_to_unlock_a_basic_VeraCrypt_file_container_with_a_keyfile.journal

After features/support/hooks.rb:100 0.047
55.638
Scenario Use GNOME Disks to unlock a hidden VeraCrypt file container
Steps
When I plug and mount a USB drive containing a hidden VeraCrypt file container 27.956
And I unlock and mount this VeraCrypt file container with GNOME Disks 22.148
And I open this VeraCrypt volume in GNOME Files 4.743
Then I see the expected contents in this VeraCrypt volume 0.617
When I lock the currently opened VeraCrypt file container 0.171
Command failed: udisksctl unmount --block-device /dev/mapper/tcrypt-*
error code: 1
stdout: 
stderr: Error unmounting /dev/dm-0: GDBus.Error:org.freedesktop.UDisks2.Error.DeviceBusy: Error unmounting /dev/dm-0: target is busy
.
<false> is not true. (ExecutionFailedInVM)
./features/support/helpers/vm_helper.rb:496:in `rescue in execute_successfully'
./features/support/helpers/vm_helper.rb:493:in `execute_successfully'
./features/step_definitions/veracrypt.rb:330:in `/^I lock the currently opened VeraCrypt (volume|file container)$/'
features/veracrypt.feature:79:in `When I lock the currently opened VeraCrypt file container'
Then the VeraCrypt file container has been unmounted and locked 0.000
After features/support/hooks.rb:330 5.391

SCENARIO FAILED: 'Use GNOME Disks to unlock a hidden VeraCrypt file container' (at time 08:02:52)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/08:02:52_Use_GNOME_Disks_to_unlock_a_hidden_VeraCrypt_file_container.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/08:02:52_Use_GNOME_Disks_to_unlock_a_hidden_VeraCrypt_file_container.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/08:02:52_Use_GNOME_Disks_to_unlock_a_hidden_VeraCrypt_file_container.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/08:02:52_Use_GNOME_Disks_to_unlock_a_hidden_VeraCrypt_file_container.journal

After features/support/hooks.rb:100 0.017
43.108
Scenario Use GNOME Disks to unlock a basic VeraCrypt file container with a PIM
Steps
When I plug and mount a USB drive containing a basic VeraCrypt file container with a PIM 17.462
And I unlock and mount this VeraCrypt file container with GNOME Disks 20.966
And I open this VeraCrypt volume in GNOME Files 4.104
Then I see the expected contents in this VeraCrypt volume 0.473
When I lock the currently opened VeraCrypt file container 0.101
Command failed: udisksctl unmount --block-device /dev/mapper/tcrypt-*
error code: 1
stdout: 
stderr: Error unmounting /dev/dm-0: GDBus.Error:org.freedesktop.UDisks2.Error.DeviceBusy: Error unmounting /dev/dm-0: target is busy
.
<false> is not true. (ExecutionFailedInVM)
./features/support/helpers/vm_helper.rb:496:in `rescue in execute_successfully'
./features/support/helpers/vm_helper.rb:493:in `execute_successfully'
./features/step_definitions/veracrypt.rb:330:in `/^I lock the currently opened VeraCrypt (volume|file container)$/'
features/veracrypt.feature:87:in `When I lock the currently opened VeraCrypt file container'
Then the VeraCrypt file container has been unmounted and locked 0.000
After features/support/hooks.rb:330 5.595

SCENARIO FAILED: 'Use GNOME Disks to unlock a basic VeraCrypt file container with a PIM' (at time 08:03:51)

Boot log: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/08:03:51_Use_GNOME_Disks_to_unlock_a_basic_VeraCrypt_file_container_with_a_PIM.file_content_var_log_boot.log

Screenshot: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/08:03:51_Use_GNOME_Disks_to_unlock_a_basic_VeraCrypt_file_container_with_a_PIM.png

Video: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/08:03:51_Use_GNOME_Disks_to_unlock_a_basic_VeraCrypt_file_container_with_a_PIM.mkv

Systemd journal: https://jenkins.tails.boum.org/job/test_Tails_ISO_feature-trixie/28/artifact/build-artifacts/08:03:51_Use_GNOME_Disks_to_unlock_a_basic_VeraCrypt_file_container_with_a_PIM.journal

After features/support/hooks.rb:100 0.000
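
Note: every VeraCrypt lock step in this run failed on its first and only unmount attempt. If the busy state is a transient race (GNOME Files releasing the mount a moment too late), a retry loop around the unmount would distinguish a race from a real leak. A minimal sketch, not the current implementation of features/step_definitions/veracrypt.rb:330: it assumes the try_for helper from features/support/helpers/misc_helpers.rb and the $vm.execute helper from features/support/helpers/vm_helper.rb; the device glob is copied from the failing command.

def unmount_veracrypt_volume_with_retries(device = '/dev/mapper/tcrypt-*')
  # Keep re-trying the unmount for up to 30 seconds; try_for raises
  # Timeout::Error if the block never returns a truthy value.
  try_for(30) do
    $vm.execute("udisksctl unmount --block-device #{device}").success?
  end
end

If the scenarios still failed with a retry like this in place, the mount would be genuinely leaked rather than released late, and identifying the holding process (fuser on the mountpoint, as sketched earlier) becomes the more useful next step.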