Render SwiftUI view offscreen and save view as UIImage to share

Posted: 2021-02-15 19:32:21

Question:

I'm trying to create a share button with SwiftUI that, when pressed, shares a generated image. I've found tutorials that screenshot the currently displayed view and convert it to a UIImage, but I want to programmatically create a view offscreen, save it into a UIImage, and then let the user share that image through a share sheet.
```swift
import SwiftUI
import SwiftyJSON
import MapKit

struct ShareRentalView: View {
    @State private var region = MKCoordinateRegion(
        center: CLLocationCoordinate2D(latitude: 32.786038, longitude: -117.237324),
        span: MKCoordinateSpan(latitudeDelta: 0.025, longitudeDelta: 0.025))
    @State var coordinates: [JSON] = []
    @State var origin: CGPoint? = nil
    @State var size: CGSize? = nil

    var body: some View {
        GeometryReader { geometry in
            VStack(spacing: 0) {
                ZStack {
                    HistoryMapView(region: region, pointsArray: $coordinates)
                        .frame(height: 300)
                }
                .frame(height: 300)
            }
            .onAppear {
                self.origin = geometry.frame(in: .global).origin
                self.size = geometry.size
            }
        }
    }

    func returnScreenShot() -> UIImage {
        return takeScreenshot(origin: self.origin.unsafelyUnwrapped, size: self.size.unsafelyUnwrapped)
    }
}
```
```swift
extension UIView {
    var renderedImage: UIImage {
        // rect to capture
        let rect = self.bounds
        // create the bitmap context
        UIGraphicsBeginImageContextWithOptions(rect.size, false, 0.0)
        let context: CGContext = UIGraphicsGetCurrentContext()!
        self.layer.render(in: context)
        // get an image from the current bitmap context
        let capturedImage: UIImage = UIGraphicsGetImageFromCurrentImageContext()!
        UIGraphicsEndImageContext()
        return capturedImage
    }
}
```
```swift
extension View {
    func takeScreenshot(origin: CGPoint, size: CGSize) -> UIImage {
        let window = UIWindow(frame: CGRect(origin: origin, size: size))
        let hosting = UIHostingController(rootView: self)
        hosting.view.frame = window.frame
        window.addSubview(hosting.view)
        window.makeKeyAndVisible()
        return hosting.view.renderedImage
    }
}
```
This is my current idea in code. I have a view whose onAppear records the CGPoint and CGSize needed for the screen capture, plus an attached method that can then take a screenshot of the view. The problem is that this view is never rendered, because I never add it to a parent view (I don't want it to appear to the user). In the parent view I have:
```swift
struct HistoryCell: View {
    ...
    private var shareRental: ShareRentalView? = nil
    private var uiimage: UIImage? = nil
    ...

    init() {
        ...
        self.shareRental = ShareRentalView()
    }

    var body: some View {
        ...
        Button(action: { self.uiimage = self.shareRental?.returnScreenShot() }) {
            ...
        }
        ...
    }
}
```
This doesn't work, because the view I want to screenshot is never rendered. Is there a way to render it in memory or offscreen and then create an image from it? Or do I need to take a different approach?
Comments:

Does this answer your question? ***.com/a/59333377/12299030 — I've switched to the code in the first answer there, and it just returns a blank UIImage. It does return a UIImage, and it does reach the rendering part of the code; the image simply comes back blank.

Answer:

This is what finally captured a screenshot of a view not displayed on screen and saved it as a UIImage:
```swift
extension UIView {
    func asImage() -> UIImage {
        let format = UIGraphicsImageRendererFormat()
        format.scale = 1
        return UIGraphicsImageRenderer(size: self.layer.frame.size, format: format).image { context in
            self.drawHierarchy(in: self.layer.bounds, afterScreenUpdates: true)
        }
    }
}
```
```swift
extension View {
    func asImage() -> UIImage {
        let controller = UIHostingController(rootView: self)
        let size = controller.sizeThatFits(in: UIScreen.main.bounds.size)
        controller.view.bounds = CGRect(origin: .zero, size: size)
        let image = controller.view.asImage()
        return image
    }
}
```
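As a quick sanity check, the extensions above can be exercised with any small SwiftUI view. `SimpleBadge` below is a made-up view used only for illustration; the `asImage()` extensions above are assumed to be in scope:

```swift
import SwiftUI
import UIKit

// Hypothetical view, only for demonstrating the asImage() extension above.
struct SimpleBadge: View {
    var body: some View {
        Text("Rental #42")
            .padding()
            .background(Color.orange)
    }
}

// Render the view offscreen into a UIImage and hand it to a share sheet.
let badgeImage = SimpleBadge().asImage()
let activityVC = UIActivityViewController(activityItems: [badgeImage],
                                          applicationActivities: nil)
```

The view never has to be added to the visible hierarchy; `UIHostingController` sizes and draws it entirely in memory.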
Then in my parent view:
```swift
var shareRental: ShareRentalView?

init() {
    ....
    self.shareRental = ShareRentalView()
}

var body: some View {
    Button(action: {
        let shareImage = self.shareRental?.asImage()
    }) {
        ...
    }
}
```
This got me almost there. But MKMapSnapshotter has a delay while it loads, and the image creation happened too quickly, so the map was missing from the resulting UIImage.

To work around the map loading delay, I created a class that builds all the UIImages and stores them in an array:
```swift
class MyUser: ObservableObject {
    ...
    public func buildHistoryRental() {
        self.historyRentals.removeAll()
        MapSnapshot().generateSnapshot(completion: self.snapShotRsp)
    }

    private func snapShotRsp(image: UIImage) {
        self.historyRentals.append(image)
    }
}
```
Then I created a class that generates the snapshot images like this:
```swift
func generateSnapshot(completion: @escaping (UIImage) -> Void) {
    let mapSnapshotOptions = MKMapSnapshotOptions()

    // Set the region of the map that is rendered (from the polyline's bounding rect).
    let polyLine = MKPolyline(coordinates: &yourCoordinates, count: yourCoordinates.count)
    let region = MKCoordinateRegionForMapRect(polyLine.boundingMapRect)
    mapSnapshotOptions.region = region

    // Set the scale of the image. We'll just use the scale of the current device,
    // which is 2x scale on Retina screens.
    mapSnapshotOptions.scale = UIScreen.main.scale

    // Set the size of the image output.
    mapSnapshotOptions.size = CGSize(width: IMAGE_VIEW_WIDTH, height: IMAGE_VIEW_HEIGHT)

    // Show buildings and points of interest on the snapshot.
    mapSnapshotOptions.showsBuildings = true
    mapSnapshotOptions.showsPointsOfInterest = true

    let snapshotter = MKMapSnapshotter(options: mapSnapshotOptions)
    snapshotter.start { (snapshot: MKMapSnapshotter.Snapshot?, error: Error?) in
        if error != nil {
            print("\(String(describing: error))")
        } else {
            let image = self.drawLineOnImage(snapshot: snapshot.unsafelyUnwrapped)
            completion(image)
        }
    }
}

func drawLineOnImage(snapshot: MKMapSnapshot) -> UIImage {
    let image = snapshot.image

    // Begin an image context at Retina scale.
    UIGraphicsBeginImageContextWithOptions(self.imageView.frame.size, true, 0)

    // Draw the original snapshot image into the context.
    image.draw(at: CGPoint.zero)

    // Get the context for CoreGraphics.
    let context = UIGraphicsGetCurrentContext()

    // Set the stroke width and color of the context.
    context!.setLineWidth(2.0)
    context!.setStrokeColor(UIColor.orange.cgColor)

    // Here is the trick:
    // We use move(to:) and addLine(to:) to draw the line, which is easy to understand.
    // The difficult part is that they both take CGPoint parameters, which would be
    // far too complex to calculate ourselves, so snapshot.point(for:) saves the pain.
    context!.move(to: snapshot.point(for: yourCoordinates[0]))
    for i in 0...yourCoordinates.count - 1 {
        context!.addLine(to: snapshot.point(for: yourCoordinates[i]))
        context!.move(to: snapshot.point(for: yourCoordinates[i]))
    }

    // Apply the stroke to the context.
    context!.strokePath()

    // Get the image from the graphics context.
    let resultImage = UIGraphicsGetImageFromCurrentImageContext()

    // End the graphics context.
    UIGraphicsEndImageContext()

    return resultImage!
}
```

(`yourCoordinates`, `IMAGE_VIEW_WIDTH`, `IMAGE_VIEW_HEIGHT`, and `self.imageView` are placeholders for your own coordinates and output size.)
It's important to return the image asynchronously through the callback. Trying to return the image directly from the function call produced a blank map.
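On iOS 15+ the same "return asynchronously" requirement can be expressed with async/await by wrapping the snapshotter's completion handler in a continuation. This is a sketch under that assumption, not the answer's exact code; drawing the polyline would still happen on the returned image, as in `drawLineOnImage` above:

```swift
import MapKit
import UIKit

// Sketch: wrap MKMapSnapshotter in a checked continuation so callers can await the image.
func snapshotImage(options: MKMapSnapshotter.Options) async throws -> UIImage {
    let snapshotter = MKMapSnapshotter(options: options)
    return try await withCheckedThrowingContinuation { continuation in
        snapshotter.start { snapshot, error in
            if let error = error {
                continuation.resume(throwing: error)
            } else if let snapshot = snapshot {
                // The map has fully rendered by the time this fires,
                // so the image is never blank.
                continuation.resume(returning: snapshot.image)
            }
        }
    }
}
```

This avoids the blank-map race for the same reason the callback does: the image is only handed back once the snapshotter reports it has finished rendering.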