
Canvas image is displayed properly on emulator, but not shown after building release apk #338

Open
lilian-delouvy opened this issue Aug 18, 2024 · 1 comment


lilian-delouvy commented Aug 18, 2024

Hello,

I'm trying to create a component to do 9-patch scaling of an image.

import React from 'react';
import { Dimensions, Image as RNimage } from 'react-native';
import Canvas, { Image as CanvasImage } from 'react-native-canvas';

const boardImageDimensions = {
    width: 1920,
    height: 1080
};

const resolveImage = () => {
    return require('../assets/images/board.jpg');
};

const NineSliceScaler = () => {

    const userScreenDimensions = {
        width: Dimensions.get('screen').width,
        height: Dimensions.get('screen').height
    };

    const renderCanvas = (canvas: any) => {
        if (!canvas) return;
    
        canvas.width = userScreenDimensions.width;
        canvas.height = userScreenDimensions.height;
        const ctx = canvas.getContext('2d');

        const img = new CanvasImage(canvas);
        const imageUri = RNimage.resolveAssetSource(resolveImage()).uri;
        img.src = imageUri;

        img.addEventListener('load', () => {
            // 9-slice scaling
            const sliceImgX = img.width / 3;
            const sliceImgY = img.height / 3;

            // define the 9 parts as [x, y, width, height]
            const part1 = [0, 0, sliceImgX, sliceImgY];
            const part2 = [sliceImgX, 0, sliceImgX, sliceImgY];
            const part3 = [sliceImgX * 2, 0, sliceImgX, sliceImgY];
            const part4 = [0, sliceImgY, sliceImgX, sliceImgY];
            const part5 = [sliceImgX, sliceImgY, sliceImgX, sliceImgY];
            const part6 = [sliceImgX * 2, sliceImgY, sliceImgX, sliceImgY];
            const part7 = [0, sliceImgY * 2, sliceImgX, sliceImgY];
            const part8 = [sliceImgX, sliceImgY * 2, sliceImgX, sliceImgY];
            const part9 = [sliceImgX * 2, sliceImgY * 2, sliceImgX, sliceImgY];

            const width = userScreenDimensions.width;
            const height = userScreenDimensions.height;
            const sliceScreenX = width / 3;
            const sliceScreenY = height / 3;

            // draw the corners
            ctx.drawImage(img, ...part1, 0, 0, sliceScreenX, sliceScreenY); // top left
            ctx.drawImage(img, ...part3, width - sliceScreenX, 0, sliceScreenX, sliceScreenY); // top right
            ctx.drawImage(img, ...part7, 0, height - sliceScreenY, sliceScreenX, sliceScreenY); // bottom left
            ctx.drawImage(img, ...part9, width - sliceScreenX, height - sliceScreenY, sliceScreenX, sliceScreenY); // bottom right

            // draw the edges
            ctx.drawImage(img, ...part2, sliceScreenX, 0, width - 2 * sliceScreenX, sliceScreenY); // top
            ctx.drawImage(img, ...part8, sliceScreenX, height - sliceScreenY, width - 2 * sliceScreenX, sliceScreenY); // bottom
            ctx.drawImage(img, ...part4, 0, sliceScreenY, sliceScreenX, height - 2 * sliceScreenY); // left
            ctx.drawImage(img, ...part6, width - sliceScreenX, sliceScreenY, sliceScreenX, height - 2 * sliceScreenY); // right

            // draw the center
            ctx.drawImage(img, ...part5, sliceScreenX, sliceScreenY, width - 2 * sliceScreenX, height - 2 * sliceScreenY);
        });

        
    };

    return (
        <Canvas ref={renderCanvas}/>
    );
};

export default NineSliceScaler;
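For reference, the slice math in the load handler above can be factored into a pure helper (a sketch of my own; `nineSliceRects` and the `Rect` type are my names, not part of react-native-canvas):

```typescript
// Rect = [x, y, width, height], matching the drawImage argument order.
type Rect = [number, number, number, number];

interface Size { width: number; height: number; }

// Compute the nine (source, destination) rectangle pairs for 9-slice
// scaling as drawn above: corners map to fixed thirds of the screen,
// edges stretch along one axis, and the center stretches along both.
function nineSliceRects(img: Size, screen: Size): { src: Rect; dst: Rect }[] {
    const sx = img.width / 3;            // source slice width
    const sy = img.height / 3;           // source slice height
    const dx = screen.width / 3;         // destination corner width
    const dy = screen.height / 3;        // destination corner height
    const midW = screen.width - 2 * dx;  // stretched width of edges/center
    const midH = screen.height - 2 * dy; // stretched height of edges/center

    const rects: { src: Rect; dst: Rect }[] = [];
    for (let row = 0; row < 3; row++) {
        for (let col = 0; col < 3; col++) {
            const src: Rect = [col * sx, row * sy, sx, sy];
            const dst: Rect = [
                col === 0 ? 0 : col === 1 ? dx : screen.width - dx,
                row === 0 ? 0 : row === 1 ? dy : screen.height - dy,
                col === 1 ? midW : dx,
                row === 1 ? midH : dy,
            ];
            rects.push({ src, dst });
        }
    }
    return rects;
}
```

With a helper like this, the nine drawImage calls collapse to a single loop: `for (const r of nineSliceRects(img, userScreenDimensions)) ctx.drawImage(img, ...r.src, ...r.dst);`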

The emulator shows the image properly, but after building a release APK and running it on a real device, the canvas is not shown.

There is no logged error at all, as if it had worked properly.

Is there something I'm missing here? I'm fairly new to react-native, so that might be the case.

I already took a look at the examples folder you provide, but none of the examples demonstrates usage of a local image. I'm wondering if there is a bug in the local image handling.

lilian-delouvy (Author) commented

Found the solution.

It seems like the library has trouble with local images. There are plenty of examples of this library using remote images (take a look at the "example/app/index.tsx" file), but none with local images. Turns out, using

const img = new CanvasImage(canvas);
const imageUri = RNimage.resolveAssetSource(resolveImage()).uri;
img.src = imageUri;

does not work. What works instead is to encode the image in base64 and then do this:

const image = new CanvasImage(canvas);
image.src = "data:image/png;base64,<YOUR_BASE_64_HERE>";
image.addEventListener("load", () => {
    ctx.drawImage(image, 0, 0, 100, 100); // dimensions are here only as an example
});
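For instance, building the data URI can be isolated in a small helper (a sketch; `toDataUri` is a hypothetical name of mine, not a library API, and the MIME type must match the actual asset; the board image above is a JPEG, so it would need `image/jpeg` rather than `image/png`):

```typescript
// Hypothetical helper: wrap a raw base64 string into a data URI that can
// be assigned to CanvasImage.src. The MIME type must match the encoded
// asset (e.g. image/jpeg for board.jpg).
function toDataUri(base64: string, mime: string = "image/png"): string {
    return `data:${mime};base64,${base64}`;
}
```

How you obtain the base64 string is up to you: embed it at build time, or read the asset at runtime with a file-system library such as react-native-fs (its `readFile(path, 'base64')` returns base64-encoded contents), though I have not verified the runtime approach against release builds.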

I feel there is a lack of documentation and examples here, as I've seen quite a lot of people banging their heads against this. I'll try to provide a PR to demonstrate this, but will close this issue for now.
