Tackling Image Squashing in SwiftUI

Recently at work, I picked up a ticket which involved rendering a list of images. The list was made up of 2 columns, with 2 images in each row.

Each image was to be rendered with a 1:1 aspect ratio, with a label underneath. Nothing too complicated. The only stipulation I had was that I didn’t want to define any fixed dimensions, allowing the image tile to fill the available space regardless of the iOS device. But when I tried to modify the shape of the images using the .aspectRatio() API, I didn’t get the results I expected.

Let me explain why…

Considering the images were being fetched from a remote URL, I decided to use the AsyncImage API and structured the list of images using a LazyVGrid.

The expected outcome was to look something like this:

An image of an iPhone showing a list of image tiles with 2 in each row, each with a title underneath

So first I defined my column arrangement:

private let columns = [
    GridItem(.flexible(), spacing: 16),
    GridItem(.flexible(), spacing: 16)
]

I then declared a @State property to hold my image objects which were created onAppear (just for the purpose of this article):

@State var gridItems: [TileItem] = []
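
TileItem itself isn’t the focus of this article, but for completeness, here’s a minimal sketch of what it might look like. The property names match the ones used below, and it conforms to Hashable because the ForEach further down identifies items by \.self:

struct TileItem: Hashable {
    let imageURL: String
    let name: String
}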

Next I put the main view structure together:

ScrollView {

    LazyVGrid(columns: columns, spacing: 16) {

        ForEach(gridItems, id: \.self) { item in

            TileView(
                imageURL: item.imageURL,
                name: item.name
            )
        }
    }
    .padding(.horizontal, 16)
}
.frame(maxWidth: .infinity, maxHeight: .infinity)
.padding(.top, 18)
.onAppear {
    createArray()
}

As you can see above, this consisted of an outer ScrollView, with a LazyVGrid nested inside it using our column arrangement. Inside that, we use ForEach to iterate over the list of gridItems, passing the details of each object into a TileView.
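
The onAppear closure calls createArray(), which just populates gridItems with some sample data (again, only for the purpose of this article). A minimal sketch, using placeholder URLs of my own rather than the real ones, might look like this:

private func createArray() {
    gridItems = (1...10).map { index in
        TileItem(
            imageURL: "https://example.com/image-\(index).jpg",
            name: "Tile \(index)"
        )
    }
}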

With that in place, let me show you my TileView.

public struct TileView: View {

    let imageURL: String
    let name: String

    public var body: some View {

        VStack(alignment: .leading, spacing: 0) {

            tileImage

            VStack(alignment: .leading, spacing: 16) {

                Text(name)
            }
            .padding(16)
        }
        .background(Color.gray.opacity(0.3))
        .cornerRadius(8)
    }
}

The TileView consisted of a parent VStack which held the image at the top, and a child VStack at the bottom which held the name (psst, yes I know, I probably could have removed the bottom VStack, but in the real-life project there was more than just the name in there 😉).

Now the interesting part:

@ViewBuilder
var tileImage: some View {

    if let url = URL(string: imageURL) {

        AsyncImage(url: url) { phase in

            switch phase {
            case .empty:

                emptyView(loading: true)
            case .success(let image):

                image
                    .resizable()
                    .aspectRatio(1.0, contentMode: .fill)
                    .clipped()
            case .failure:

                emptyView()
            @unknown default:

                emptyView()
            }
        }
    }
}
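
The emptyView(loading:) helper isn’t shown here; it’s just a simple placeholder for the loading and failure states. Something along these lines would do (the grey rectangle and ProgressView are purely illustrative):

@ViewBuilder
func emptyView(loading: Bool = false) -> some View {

    ZStack {

        Rectangle()
            .fill(Color.gray.opacity(0.3))
            .aspectRatio(1.0, contentMode: .fit)

        if loading {
            ProgressView()
        }
    }
}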

This was my AsyncImage view. Everything, in my opinion, looks like it should work, but when I ran the project this was the output:

An image of an iPhone showing a list of image tiles with 2 in each row, each with a title underneath; the images look squashed in from the sides, making them not look that great.

Now, at first glance this may look OK, but when you look closer, these images are actually being squashed in from the sides, which is not a good look.

So let’s take a closer look at our implementation:

image
    .resizable()
    .aspectRatio(1.0, contentMode: .fill)
    .clipped()

First we have the image, which is the associated value of the .success case returned by AsyncImage’s phase.

The .resizable() modifier then allows the image to be resized to fit the available space.

Then we have .aspectRatio(1.0, contentMode: .fill), which, according to the documentation, should:

Constrain a view’s dimensions to an aspect ratio specified by a CGFloat using the specified content mode.

Now the content mode is .fill, which according to the documentation should be:

An option that resizes the content so it occupies all available space, both vertically and horizontally.

This mode preserves the content’s aspect ratio. If the content doesn’t have the same aspect ratio as the available space, the content becomes the same size as the available space on one axis, and larger on the other axis.

However, in practice, this wasn’t happening. Instead of the image filling the available space while preserving its aspect ratio, the images were being squashed. To illustrate, let’s consider this 16:9 image as an example:

A 16:9 image of a MacBook in the middle of a wooden table, an open notebook and pen on the left, and an iPhone 3GS on the right with a cup of espresso located underneath it

If this were one of the images I was passing into the above list, this is what is happening to it; it’s being squashed into the available space from the vertical edges:

The same image as above squashed into a 1:1 aspect ratio

What should happen instead is that SwiftUI takes a cropped section of the image:

The same 16:9 image of the MacBook with a 1:1 yellow square outline positioned over the centre, showing the area which the image cropping should have taken rather than squashing the original image

And then make it look like the below:

A 1:1 image of the MacBook, half a notebook to the left of it, the iPhone and half a cup of coffee on the right

Much better, isn’t it? So it appears that when we apply the aspectRatio() modifier whilst specifying an aspect ratio of 1.0 and a contentMode of .fill, SwiftUI is confining the image it’s applied to to the specified aspect ratio, but it’s not respecting the image’s original aspect ratio.

So after some time spent trying to find a workaround (whilst repeatedly banging my head against the wall) to see how else it could be done, I came up with the solution below:

Rectangle()
    .fill(Color.clear)
    .aspectRatio(1.0, contentMode: .fill)
    .background {
        image
            .resizable()
            .scaledToFill()
            .clipped()
    }

By declaring a clear Rectangle(), specifying its aspect ratio, and then passing our image into its background, we get the desired outcome. Just for clarification for anyone who doesn’t know, .scaledToFill() scales the image to fill its parent’s bounds while still keeping its original aspect ratio, and is the equivalent of using .aspectRatio(contentMode: .fill).
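
For completeness, this is how the fix slots back into the success case of the AsyncImage from earlier:

case .success(let image):

    Rectangle()
        .fill(Color.clear)
        .aspectRatio(1.0, contentMode: .fill)
        .background {
            image
                .resizable()
                .scaledToFill()
                .clipped()
        }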

Just so that you can see the two used together:

An image of an iPhone showing a list of image tiles with 2 in each row. The top image shows the original squashed version; the image underneath shows the image with the corrected aspect ratio

How much an image appears to be affected really does depend on the image itself, but you can clearly see the difference here, with the bottom images looking much better than the top ones.

The final outcome, with all images looking the way they are supposed to:

An image of an iPhone showing a list of image tiles with 2 in each row, each with a title underneath. The final image showing the images with the correct 1:1 aspect ratio

Conclusion

When .aspectRatio(1.0, contentMode: .fill) is applied to an image, I personally would have expected it to respect the image’s original aspect ratio, whatever that may be, but modify the parent container of the image to a 1:1 aspect ratio and fill the available space, leaving no white space around it.

This desired outcome can still be achieved using the above method, but I do wonder if this is a problem in the aspectRatio() API, or whether it’s just my understanding of how it should be used.

If any of you have a better understanding than me and have an answer to my conundrum, then please get in touch.

Thanks for stopping by 🙂.