Replies: 2 comments
-
Hi @loxai, take a cue from these examples. You can also define your palette from an array (16-bit colors here):

```cpp
constexpr uint16_t PALETTE[] = {
    0x220c, // hsl(210, 50, 25) | #204060
    0x43f7, // hsl(210, 50, 50) | #407fbf
    0x8dfd, // hsl(210, 75, 75) | #8fbfef
    0xffff  // hsl(210, 100, 100) | #ffffff
};
sprite.createPalette(PALETTE, 4);
```

On the other hand, if you don't want to be limited by the number of colors, you can also consider rendering in slices (horizontal or vertical) with micro-buffers.
-
Thanks! I got it working using the setPaletteColor method, as mentioned in the samples.
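For anyone landing here later, a minimal sketch of that setPaletteColor approach. This assumes an already-configured LGFX_Device (the class names and init code depend on your board configuration, so treat it as illustrative rather than verified on hardware):

```cpp
#include <LovyanGFX.hpp>  // assumed include path; adjust to your setup

static LGFX display;              // your configured LGFX_Device
static LGFX_Sprite fb(&display);  // paletted framebuffer sprite

void setup() {
    display.init();
    fb.setColorDepth(4);           // 4 bpp = 16-color paletted sprite
    fb.createSprite(320, 240);     // a quarter the RAM of a 16-bit buffer
    // Assign RGB colors to palette indices 0..15.
    fb.setPaletteColor(0, 0x20, 0x40, 0x60);
    fb.setPaletteColor(1, 0x40, 0x7f, 0xbf);
    fb.setPaletteColor(2, 0x8f, 0xbf, 0xef);
    fb.setPaletteColor(3, 0xff, 0xff, 0xff);
    // Drawing calls take palette indices as their "color" argument.
    fb.fillRect(10, 10, 100, 50, 2);
    fb.pushSprite(0, 0);           // palette lookup happens on push
}

void loop() {}
```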
-
Hi (to tobozo, lovyan or any other welcome expert :)
I'm using the library on an M5Stack. I've been working on a large project and running into memory problems.
Until now, I have been using a 320x240x16-bit framebuffer to do fast graphics, but that's a good chunk of memory.
I tried using begin/end transactions, but I always get flicker; maybe I'm not doing it right (I might open a separate issue after closing this one).
If I use a 320x240x4-bit buffer, I save a lot of memory and still have enough colors, but I don't see a way to set the palette (i.e. a custom palette for a 2- or 4-bit buffer). Is there a way to set an RGB array as the palette for a sprite or the LGFX_Device? (Currently I have the latter at 16-bit, while testing the sprite/buffer at 2 or 4 bits; I only get color at 8.)
Kind regards