Automated App Store Screenshots with Fastlane
Every time you update your app, you need fresh screenshots for App Store Connect. Doing this manually across multiple devices and languages is tedious and error-prone. Fastlane’s snapshot action automates the entire process.
In this guide, I’ll walk through setting up automated screenshots, share lessons learned from iOS 26’s new TabView behavior, and explain why investing in proper test infrastructure pays off.
The Setup: What We’re Building
Our goal is to capture screenshots for:
- 2 devices: iPhone 17 Pro Max (6.9”) and iPad Pro 13” (App Store Connect scales these to the other required sizes)
- 2 languages: English and Polish
- 2 appearances: Light and dark mode
That’s 8 screenshot sets generated automatically with one command.
Project Structure
```
YourApp/
├── fastlane/
│   ├── Fastfile              # Lane definitions
│   ├── Snapfile              # Device/language config
│   └── screenshots/          # Generated output
│       ├── dark/
│       │   ├── en-US/
│       │   └── pl/
│       └── light/
│           ├── en-US/
│           └── pl/
├── YourAppUITests/
│   ├── ScreenshotTests.swift
│   └── SnapshotHelper.swift
└── YourApp/
    ├── Testing/
    │   ├── StubDataProvider.swift
    │   └── StubServices.swift
    └── Utilities/
        └── AccessibilityIdentifiers.swift
```
Step 1: Configure Snapfile
The Snapfile defines which devices and languages to capture:
```ruby
# fastlane/Snapfile

# Devices - App Store scales from these to smaller sizes
devices([
  "iPhone 17 Pro Max",      # 6.9" - required, scales to all iPhones
  "iPad Pro 13-inch (M5)"   # 13" - required for iPad screenshots
])

# Languages to capture
languages([
  "en-US",
  "pl"
])

# Scheme containing UI tests
scheme("YourApp")

# Output location
output_directory("./fastlane/screenshots")

# Clean slate each run
clear_previous_screenshots(true)

# Nice status bar (9:41, full battery)
override_status_bar(true)

# Run simulators in parallel for speed
concurrent_simulators(true)
```
Step 2: Separate Light and Dark Mode Tests
You need screenshots in both light and dark mode. There are two approaches:
Option A: Single test, switch appearance mid-run
```swift
func testAllScreenshots() {
    for appearance in [XCUIDevice.Appearance.light, .dark] {
        XCUIDevice.shared.appearance = appearance
        app.launch()
        snapshot("01_Main_\(appearance)")
        // ... more screenshots
        app.terminate()
    }
}
```
Problems:
- App must terminate and relaunch to apply new appearance
- All screenshots go to same output folder (harder to organize)
- Can’t run just dark mode or just light mode independently
- If dark mode fails, you lose light mode screenshots too
Option B: Separate test functions (recommended)
Create distinct test functions for each appearance. Fastlane runs them as separate test invocations, setting simulator appearance before the app launches:
```swift
// YourAppUITests/ScreenshotTests.swift
final class ScreenshotTests: XCTestCase {
    private var app: XCUIApplication!

    @MainActor
    override func setUpWithError() throws {
        continueAfterFailure = false
        app = XCUIApplication()
        app.launchArguments.append("-UITest")
        setupSnapshot(app)
        // Force portrait orientation for consistent screenshots
        XCUIDevice.shared.orientation = .portrait
    }

    @MainActor
    private func launchAndCapture(appearance: XCUIDevice.Appearance) {
        XCUIDevice.shared.appearance = appearance
        app.launch()
        // Wait for content to load
        sleep(2)
        // Capture screenshots...
        snapshot("01_Main")
        // Navigate and capture more...
    }

    @MainActor
    func testLightMode() throws {
        launchAndCapture(appearance: .light)
    }

    @MainActor
    func testDarkMode() throws {
        launchAndCapture(appearance: .dark)
    }
}
```
Benefits:
- Fastlane sets appearance before app launch (no restart needed)
- Separate output directories (`screenshots/light/`, `screenshots/dark/`)
- Run independently: `fastlane screenshots_dark` when iterating on dark mode
- Parallelizable: light and dark can run on different simulators simultaneously
- If one fails, the other still completes
Then in your Fastfile, create lanes that run each test separately:
```ruby
# fastlane/Fastfile
platform :ios do
  desc "Capture screenshots in light mode"
  lane :screenshots_light do
    capture_screenshots(
      dark_mode: false,
      output_directory: "./fastlane/screenshots/light",
      only_testing: ["YourAppUITests/ScreenshotTests/testLightMode"]
    )
  end

  desc "Capture screenshots in dark mode"
  lane :screenshots_dark do
    capture_screenshots(
      dark_mode: true,
      output_directory: "./fastlane/screenshots/dark",
      only_testing: ["YourAppUITests/ScreenshotTests/testDarkMode"]
    )
  end

  desc "Capture all screenshots"
  lane :screenshots do
    screenshots_light
    screenshots_dark
  end
end
```
Step 3: Use Accessibility Identifiers (Never Hardcoded Strings)
This is crucial. Never use localized text to find UI elements:
```swift
// Bad - breaks with translations (note it also needs var, not let,
// to allow the reassignment below)
var trendsTab = app.tabBars.buttons["Trends"]
if !trendsTab.exists {
    trendsTab = app.tabBars.buttons["Trendy"] // Polish fallback
}

// Good - works regardless of language
let trendsTab = app.buttons[Identifiers.Tab.trends].firstMatch
```
Define identifiers in your main app:
```swift
// YourApp/Utilities/AccessibilityIdentifiers.swift
enum AccessibilityIdentifiers {
    enum Tab {
        static let current = "tab_current"
        static let trends = "tab_trends"
        static let settings = "tab_settings"
    }

    enum Main {
        static let pressureCard = "card_pressure"
        static let temperatureCard = "card_temperature"
        // ...
    }
}
```
Apply them to your views:
```swift
Tab("Trends", systemImage: "chart.line.uptrend.xyaxis", value: 1) {
    TrendsView()
}
.accessibilityIdentifier(AccessibilityIdentifiers.Tab.trends)
```
Then import them in your UI tests (see Step 8 for details):
```swift
// YourAppUITests/ScreenshotTests.swift
import XCTest
@testable import YourApp

private typealias Identifiers = AccessibilityIdentifiers
```
iOS 26 Gotcha: TabView Renders Differently on iPad vs iPhone
On iOS 26 with Liquid Glass, TabView behaves differently across devices. iPad renders tabs as toolbar buttons (with working identifiers), while iPhone uses traditional tab bars (identifiers don’t work).
Quick fix: use `.firstMatch` for iPad’s duplicate elements, and fall back to index-based queries for iPhone.
For the full breakdown and a reusable helper, see iOS 26 TabView UI Tests: Making Identifiers Work Everywhere.
Step 4: Force Portrait Mode for iPhones
The simulator doesn’t guarantee orientation state between test runs. Without explicitly setting orientation:
- Simulator may retain landscape from a previous test or manual interaction
- Parallel test runs on cloned simulators can have inconsistent orientations
- App Store Connect rejects screenshots with wrong dimensions
iPhone screenshots must be portrait. App Store Connect expects specific pixel dimensions (e.g., 1320×2868 for iPhone 17 Pro Max). A landscape screenshot produces 2868×1320 - wrong dimensions, rejected upload.
```swift
@MainActor
override func setUpWithError() throws {
    // ...
    // CRITICAL: Force portrait before every test
    // Simulator orientation persists across runs and can be unpredictable
    XCUIDevice.shared.orientation = .portrait
}
```
Set this in `setUpWithError()`, not in the test function - it must happen before `app.launch()` so the app renders with correct dimensions from the start.
Note: iPad screenshots can be landscape if your app supports it, but ensure consistency - don’t mix orientations for the same screen.
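Since wrong dimensions only surface at upload time, a quick local check can save a round trip. Below is a minimal sketch (not part of fastlane) that reads a screenshot’s width and height straight from the PNG header, so a script could flag landscape iPhone files before uploading. PNG stores big-endian 32-bit width and height at byte offsets 16 and 20 of the IHDR chunk.

```swift
import Foundation

// Read (width, height) from a PNG's IHDR chunk. The IHDR chunk always
// follows the 8-byte signature and 8-byte chunk header, so width starts
// at byte 16 and height at byte 20, both big-endian 32-bit integers.
func pngDimensions(of data: Data) -> (width: Int, height: Int)? {
    guard data.count >= 24 else { return nil }
    func be32(_ offset: Int) -> Int {
        (0..<4).reduce(0) { ($0 << 8) | Int(data[offset + $1]) }
    }
    return (be32(16), be32(20))
}

// iPhone screenshots must be portrait, i.e. taller than wide.
func isPortrait(_ size: (width: Int, height: Int)) -> Bool {
    size.height > size.width
}
```

Loop this over the output folder and fail fast if any iPhone file comes back landscape or with unexpected dimensions.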
Step 5: Mock Data for Consistent Screenshots
Never hit live servers in UI tests. This is a fundamental rule:
Why live servers break screenshot tests
- Flaky tests: Network timeouts, server errors, rate limiting - your CI fails randomly
- Non-deterministic data: Server returns different data each run - screenshots change unexpectedly
- Slow: Network latency adds seconds to every test, multiplied by devices × languages
- Uncontrollable content: Can’t guarantee attractive data for App Store screenshots
- Cost: API calls during parallel test runs can hit rate limits or incur charges
- External dependency: Server maintenance window = broken CI pipeline
The solution: Stub services with deterministic data
Inject mock services when running in UI test mode. The app never knows the difference - it just gets data from a “service” that happens to return hardcoded values:
```swift
// YourApp/Testing/StubDataProvider.swift
enum StubDataProvider {
    // Always return the same attractive scenario
    static var currentWeather: CurrentWeather {
        CurrentWeather(
            temperature: 27.0,
            humidity: 30,
            pressure: 1006.0,
            // ... deterministic values that look good in screenshots
        )
    }
}
```
Base mocks on real data
Don’t invent data from scratch - it often looks fake. Fetch a real API response once, save it as reference, then model your stubs after it:
```sh
# Fetch real data for reference (run once, save the output)
curl -s "https://api.open-meteo.com/v1/forecast?latitude=52.23&longitude=21.01&current=temperature_2m,humidity" | jq '.' > sample_response.json
```
Then build stubs that follow real-world patterns - realistic temperature cycles, proper humidity ranges, sensible pressure values. Your screenshots will look authentic because the data is authentic (just frozen in time).
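One way to get “authentic but frozen” chart data is to generate it from a smooth curve rather than hardcoding 24 numbers. The sketch below is an assumption about how such a stub might look (the ranges and function name are made up for illustration): a daily temperature cycle built from a cosine wave, coldest around 03:00 and warmest around 15:00, rounded to one decimal like the real API.

```swift
import Foundation

// Deterministic hourly temperatures for screenshot stubs: same values
// every run, shaped like a real diurnal cycle. min/max are illustrative.
func stubHourlyTemperatures(min: Double = 18.0, max: Double = 27.0) -> [Double] {
    (0..<24).map { hour in
        // Coldest at 03:00 (phase 0), warmest at 15:00 (phase pi).
        let phase = Double(hour - 3) / 24.0 * 2 * .pi
        let t = min + (max - min) * (1 - cos(phase)) / 2
        return (t * 10).rounded() / 10   // one decimal, like the API
    }
}
```

Because the output is a pure function of the hour, every device, language, and appearance renders the identical chart.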
Step 6: Skip the Splash Screen in UI Tests
Your splash screen animation wastes precious seconds during screenshot runs. Detect UI test mode and skip it:
```swift
// YourApp/Views/SplashView.swift
import SwiftUI

struct SplashView: View {
    @State private var showContent = Container.isUITestMode // Skip in tests

    var body: some View {
        ZStack {
            if showContent {
                ContentView()
            } else {
                splashAnimation
            }
        }
        .onAppear {
            if !Container.isUITestMode {
                // Only animate in production
                DispatchQueue.main.asyncAfter(deadline: .now() + 1.5) {
                    showContent = true
                }
            }
        }
    }
}
```
Set the flag when the app is launched with the `-UITest` argument:
```swift
// YourApp/DI/AppContainer+Testing.swift
extension Container {
    @MainActor
    static func setupUITestMode() {
        isUITestMode = true
        shared.weatherService.register { StubWeatherService() }
        shared.locationService.register { StubLocationService() }
    }
}
```
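The other half of the wiring is the app noticing the argument at startup. A minimal sketch, assuming the `-UITest` argument from this post (taking the argument list as a parameter keeps the check testable; in the app you would pass `ProcessInfo.processInfo.arguments`):

```swift
import Foundation

// Pure check for the UI-test launch argument; injectable for testing.
func isUITestRun(arguments: [String]) -> Bool {
    arguments.contains("-UITest")
}

// Assumed call site at app startup (names from this post):
// if isUITestRun(arguments: ProcessInfo.processInfo.arguments) {
//     Container.setupUITestMode()
// }
```

Run this check as early as possible (e.g. in the `App` initializer) so the stub services are registered before any view resolves a dependency.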
Step 7: Test One Configuration First
Before running the full matrix (2 devices × 2 languages × 2 modes), verify everything works with a minimal configuration. Create a test lane that runs one appearance with one language but all devices:
```ruby
# Add to fastlane/Fastfile temporarily
desc "Test dark mode screenshots (en-US only, all devices)"
lane :screenshots_dark_test do
  capture_screenshots(
    dark_mode: true,
    output_directory: "./fastlane/screenshots/dark",
    only_testing: ["YourAppUITests/ScreenshotTests/testDarkMode"],
    languages: ["en-US"] # Single language, all devices from Snapfile
  )
end
```
Run it:
```sh
fastlane screenshots_dark_test
```
Then check the folder contents - don’t grep test output:
```sh
ls -la fastlane/screenshots/dark/en-US/

# Expected - ALL devices should have ALL screenshots:
# iPad Pro 13-inch (M5)-01_Main.png
# iPad Pro 13-inch (M5)-02_Trends.png
# iPhone 17 Pro Max-01_Main.png
# iPhone 17 Pro Max-02_Trends.png
```
If any file is missing, something broke. Fix it before running the full suite.
Only after all expected files exist should you run `fastlane screenshots`.
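The “check folder contents” step can also be scripted. A sketch under the assumptions of this post (fastlane names files `<device>-<screenshot>.png`; in practice you would build the `existing` set from `FileManager.default.contentsOfDirectory`):

```swift
import Foundation

// Build the full expected file list from devices x screenshot names and
// report anything missing from the output folder.
func missingScreenshots(in existing: Set<String>,
                        devices: [String],
                        shots: [String]) -> [String] {
    devices.flatMap { device in
        shots.map { "\(device)-\($0).png" }
    }.filter { !existing.contains($0) }
}
```

An empty result means the matrix is complete for that locale; anything else names exactly which device/screenshot combination failed.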
Debugging Tips
Simulator issues
If simulators fail to launch:
```sh
# Boot the simulator manually first
xcrun simctl boot "iPad Pro 13-inch (M5)"

# Then run without parallel testing
xcodebuild test ... -parallel-testing-enabled NO
```
Element hierarchy inspection
When a test can’t find an element, print the hierarchy:
```swift
print("DEBUG: buttons = \(app.buttons.debugDescription)")
```
For detailed debugging of TabView issues on iOS 26, see iOS 26 TabView UI Tests: Making Identifiers Work Everywhere.
The Complete Workflow
- Develop your UI tests with accessibility identifiers
- Create stub data based on real API responses
- Skip splash screen in test mode
- Test one configuration to verify setup
- Run full screenshot generation:

  ```sh
  # Generate all screenshots
  fastlane screenshots

  # Or just dark mode
  fastlane screenshots_dark
  ```

- Review the output in `fastlane/screenshots/`
- Upload to App Store Connect (with explicit approval):

  ```sh
  fastlane upload_screenshots
  ```
Step 8: Share Identifiers via @testable import
Don’t duplicate your accessibility identifiers in UI tests. Instead, import them from the main app:
```swift
// YourAppUITests/ScreenshotTests.swift
import XCTest
@testable import YourApp

private typealias Identifiers = AccessibilityIdentifiers

final class ScreenshotTests: XCTestCase {
    // Now use Identifiers.Tab.trends etc.
}
```
Benefits:
- Single source of truth - no sync issues
- IDE autocomplete works
- Refactoring updates both places automatically
The typealias keeps your test code concise while using the real identifiers.
Step 9: Uploading to App Store Connect
After generating screenshots, you need to upload them. This has a few gotchas.
Folder Structure for Upload
Fastlane’s deliver expects screenshots directly in locale folders:
```
fastlane/screenshots/
├── en-US/
│   ├── iPhone 17 Pro Max-01_Main.png
│   └── ...
└── pl/
    └── ...
```
But our generation creates separate dark/ and light/ folders. To upload both appearances, merge them with renamed files:
```sh
cd fastlane/screenshots
mkdir -p en-US pl

for locale in en-US pl; do
  # Dark screenshots (01, 02)
  cp "dark/$locale/iPhone 17 Pro Max-01_Main.png" "$locale/iPhone 17 Pro Max-01_Main_Dark.png"
  cp "dark/$locale/iPhone 17 Pro Max-02_Trends.png" "$locale/iPhone 17 Pro Max-02_Trends_Dark.png"

  # Light screenshots (03, 04)
  cp "light/$locale/iPhone 17 Pro Max-01_Main.png" "$locale/iPhone 17 Pro Max-03_Main_Light.png"
  cp "light/$locale/iPhone 17 Pro Max-02_Trends.png" "$locale/iPhone 17 Pro Max-04_Trends_Light.png"

  # Repeat for iPad...
done

# Remove nested folders (fastlane rejects them)
rm -rf dark light
```
The numbering (01, 02, 03, 04) controls display order in App Store Connect.
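The rename rule in that shell loop can be stated as a pure function, which is handy if you later drive the merge from a script instead of hardcoded `cp` lines. A sketch under this post’s assumptions (two screenshots per appearance, dark first): dark files keep their numbers and light files are shifted by the per-appearance count.

```swift
import Foundation

// Map (device, original index, title, appearance) to the merged filename.
// Dark keeps 01, 02; light is renumbered to 03, 04 so dark sorts first.
func mergedName(device: String, index: Int, title: String,
                appearance: String, shotsPerAppearance: Int = 2) -> String {
    let shifted = appearance == "light" ? index + shotsPerAppearance : index
    let number = String(format: "%02d", shifted)
    let suffix = appearance == "light" ? "Light" : "Dark"
    return "\(device)-\(number)_\(title)_\(suffix).png"
}
```

With more screenshots per appearance, only `shotsPerAppearance` changes; the ordering guarantee is preserved.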
Fastlane Version Matters for iOS 26
iPhone 17 Pro Max (1320×2868) and iPad Pro 13” M5 (2064×2752) are new device sizes. Older fastlane versions reject them as “invalid screen size.”
```sh
# Check your version
fastlane --version

# If < 2.230.0, upgrade
brew upgrade fastlane
```
Version 2.230.0+ includes support for iOS 26 device resolutions.
Upload Lane Configuration
```ruby
# fastlane/Fastfile
desc "Upload screenshots to App Store Connect"
lane :upload_screenshots do
  deliver(
    api_key_path: api_key_path,
    skip_binary_upload: true,
    skip_metadata: true,
    overwrite_screenshots: true,
    screenshots_path: "./fastlane/screenshots",
    run_precheck_before_submit: false # Skip for screenshot-only uploads
  )
end
```
Key options:
- `overwrite_screenshots: true` - Replace existing screenshots
- `run_precheck_before_submit: false` - Precheck can’t validate IAPs with API keys, causing false failures
- `skip_metadata: true` - Don’t touch metadata when only uploading screenshots
Running the Upload
```sh
fastlane upload_screenshots
```
This deletes existing screenshots and uploads the new ones. Verify in App Store Connect that all devices and languages have the expected screenshots.
Key Takeaways
- Separate light and dark mode tests - independent test functions for easier debugging and parallel runs
- Use accessibility identifiers, never localized strings - tests work across all languages
- Force portrait orientation - set it in `setUpWithError()`; App Store Connect rejects wrong dimensions
- Mock data based on real responses - screenshots look realistic, tests are deterministic
- Skip splash in test mode - faster test runs; check the `-UITest` launch argument
- Test one config first, check folder contents - verify all files exist before running the full matrix
- Share identifiers via `@testable import` - single source of truth, no duplication
- Keep fastlane updated - iOS 26 device sizes require version 2.230.0+
- TabView identifiers need special handling - see the companion post for iPad vs iPhone differences
Setting up automated screenshots takes some initial effort, but it pays off every release. No more manual device switching, no more forgetting to update one language, no more inconsistent status bars.
Run one command, get perfect screenshots.