Compare commits


11 Commits

52 changed files with 7996 additions and 273 deletions


@@ -190,6 +190,11 @@
z-index: 50;
}
/* Keep reconnect anchors pointer-interactive at all times (drag-detach/reconnect) */
.react-flow__edgeupdater {
pointer-events: all;
}
/* Proximity preview edge (temp) */
.react-flow__edge.temp {
opacity: 0.9;


@@ -144,16 +144,23 @@ render: 300 × 420 mixer: 360 × 320
- **Handles:** exactly two inputs on the left (`base`, `overlay`) and one output on the right (`mixer-out`).
- **Allowed inputs:** `image`, `asset`, `ai-image`, `render`.
- **Connection limits:** at most 2 incoming edges in total, exactly 1 per handle.
- **Node data (V1):** `blendMode` (`normal|multiply|screen|overlay`), `opacity` (0..100), `overlayX`, `overlayY`, `overlayWidth`, `overlayHeight` (frame rect, normalized 0..1) plus `contentX`, `contentY`, `contentWidth`, `contentHeight` (content framing inside the overlay frame, also normalized 0..1).
- **Output semantics:** pseudo-image (resolved client-side from graph + controls), no persisted asset, no storage write.
- **UI/interaction:** two preview modes: `Frame resize` (move the overlay frame + resize via corner handles) and `Content framing` (move the overlay content inside the frame). Numeric inline controls remain for fine-tuning.
- **Sizing/crop behavior:** the overlay content is fitted into the content rect `object-cover`-style; with mismatched aspect ratios it is center-cropped.
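The `object-cover`-style center-crop above can be sketched as a small helper (a sketch only; the function name and the normalized-crop return shape are assumptions, not the project's actual implementation):

```typescript
// Given a content rect (in px) and the intrinsic aspect ratio of the
// overlay image, compute the source crop (normalized 0..1) that an
// object-cover fit with center cropping would apply.
function coverCrop(
  rectWidth: number,
  rectHeight: number,
  imageAspect: number, // intrinsic width / height of the overlay image
): { x: number; y: number; width: number; height: number } {
  const rectAspect = rectWidth / rectHeight;
  if (imageAspect > rectAspect) {
    // Image is wider than the rect: keep full height, crop left/right.
    const visible = rectAspect / imageAspect;
    return { x: (1 - visible) / 2, y: 0, width: visible, height: 1 };
  }
  // Image is taller than the rect: keep full width, crop top/bottom.
  const visible = imageAspect / rectAspect;
  return { x: 0, y: (1 - visible) / 2, width: 1, height: visible };
}
```

For example, a 2:1 image placed in a square content rect keeps the horizontally centered half of the source.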
### Compare Integration (V1)
- `compare` understands `mixer` outputs via `lib/canvas-mixer-preview.ts`.
- The preview is rendered as DOM/CSS layering in the client (incl. blend/opacity/overlay rect).
- Scope stays narrow: no blanket pseudo-image support for all consumers in V1.
### Render Bake Path (V1)
- Official bake flow: `mixer -> render`.
- `render` consumes the mixer composition (`sourceComposition.kind = "mixer"`) and uses it for preview + the final render/upload.
- `mixer -> adjustments -> render` is deliberately deferred and currently out of official scope.
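A minimal sketch of the composition payload `render` consumes on this path (field names mirror the `sourceComposition` object exercised in the compare-node tests below; the type name and guard are assumptions):

```typescript
// Discriminated composition payload handed from mixer to render.
// V1 only defines the "mixer" kind.
type MixerComposition = {
  kind: "mixer";
  baseUrl: string;
  overlayUrl: string;
  blendMode: "normal" | "multiply" | "screen" | "overlay";
  opacity: number; // 0..100
  overlayX: number;
  overlayY: number;
  overlayWidth: number;
  overlayHeight: number;
};

// Narrowing guard a consumer might use before rendering the composite.
function isMixerComposition(value: unknown): value is MixerComposition {
  return (
    typeof value === "object" &&
    value !== null &&
    (value as { kind?: unknown }).kind === "mixer"
  );
}
```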
---
## Node Status Model
@@ -325,7 +332,8 @@ useCanvasData (use-canvas-data.ts)
- **Node taxonomy:** all node types are defined in `lib/canvas-node-catalog.ts`. Phase-2/3 nodes have `implemented: false` and a `disabledHint`.
- **Video connection policy:** `video-prompt` may **only** be connected to `ai-video` (and vice versa). `text → video-prompt` is allowed (prompt source). `ai-video → compare` is allowed.
- **Mixer connection policy:** `mixer` accepts only `image|asset|ai-image|render`; the only target handles are `base` and `overlay`, with at most one incoming edge per handle and at most two in total.
- **Mixer pseudo-output:** `mixer` does not produce a persisted image in V1. Official consumers are `compare` and the direct bake path `mixer -> render`; `mixer -> adjustments -> render` remains deferred for now.
- **Mixer legacy data:** old `offsetX`/`offsetY` mixer data is normalized on read to the full-frame fallback (`overlay* = 0/0/1/1`); content framing defaults to `content* = 0/0/1/1`.
- **Agent flow:** `agent` accepts only content/context sources (e.g. `render`, `compare`, `text`, `image`) as input; outgoing edges are reserved for `agent -> agent-output`.
- **Convex generated types:** `api.ai.generateVideo` may not be exported in `convex/_generated/api.d.ts`. The code uses `api as unknown as {...}` as a workaround. An `npx convex dev` cycle would regenerate the types correctly.
- **Canvas graph query:** the canvas uses `canvasGraph.get` (from `convex/canvasGraph.ts`) instead of separate `nodes.list`/`edges.list` queries. Optimistic updates run through `canvas-graph-query-cache.ts`.
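The mixer connection policy above can be sketched as a predicate (helper name and edge shape are assumptions, not the actual canvas validation code):

```typescript
// Shape of an edge already pointing at the mixer node under consideration.
type EdgeLike = { target: string; targetHandle?: string | null };

const MIXER_ALLOWED_SOURCES = new Set(["image", "asset", "ai-image", "render"]);

// Returns true when a new edge from sourceType into targetHandle would
// satisfy the policy: allowed source types only, handles base/overlay only,
// at most two incoming edges in total, at most one per handle.
function canConnectToMixer(
  sourceType: string,
  targetHandle: string,
  existingEdges: EdgeLike[],
): boolean {
  if (!MIXER_ALLOWED_SOURCES.has(sourceType)) return false;
  if (targetHandle !== "base" && targetHandle !== "overlay") return false;
  if (existingEdges.length >= 2) return false;
  return !existingEdges.some((edge) => edge.targetHandle === targetHandle);
}
```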


@@ -0,0 +1,184 @@
import type { Edge as RFEdge, Node as RFNode } from "@xyflow/react";
import { describe, expect, it } from "vitest";
import { projectCanvasFavoritesVisibility } from "../canvas-favorites-visibility";
function createNode(
id: string,
data: Record<string, unknown>,
options?: {
style?: RFNode<Record<string, unknown>>["style"];
className?: string;
},
): RFNode<Record<string, unknown>> {
return {
id,
position: { x: 0, y: 0 },
data,
style: options?.style,
className: options?.className,
type: "note",
};
}
function createEdge(
id: string,
source: string,
target: string,
options?: {
style?: RFEdge<Record<string, unknown>>["style"];
className?: string;
},
): RFEdge<Record<string, unknown>> {
return {
id,
source,
target,
style: options?.style,
className: options?.className,
type: "default",
};
}
describe("projectCanvasFavoritesVisibility", () => {
it("keeps nodes and edges unchanged when favorites focus mode is inactive", () => {
const nodes = [
createNode("node-a", { isFavorite: true }),
createNode("node-b", { label: "normal" }, { style: { width: 280, height: 200 } }),
];
const edges = [
createEdge("edge-a", "node-a", "node-b", {
style: { stroke: "rgb(0, 0, 0)", strokeWidth: 2 },
}),
];
const result = projectCanvasFavoritesVisibility({
nodes,
edges,
favoritesOnly: false,
});
expect(result.nodes[0]).toBe(nodes[0]);
expect(result.nodes[1]).toBe(nodes[1]);
expect(result.edges[0]).toBe(edges[0]);
expect(result.favoriteCount).toBe(1);
expect(Array.from(result.favoriteNodeIds)).toEqual(["node-a"]);
expect(result.nodes[1]?.style).toEqual({ width: 280, height: 200 });
expect(result.edges[0]?.style).toEqual({ stroke: "rgb(0, 0, 0)", strokeWidth: 2 });
});
it("dims non-favorite nodes when favorites focus mode is active", () => {
const nodes = [
createNode("node-a", { isFavorite: true }),
createNode("node-b", { label: "normal" }, { className: "custom-node" }),
createNode("node-c", { isFavorite: true }),
];
const edges: RFEdge<Record<string, unknown>>[] = [];
const result = projectCanvasFavoritesVisibility({
nodes,
edges,
favoritesOnly: true,
});
expect(result.nodes[0]).toBe(nodes[0]);
expect(result.nodes[2]).toBe(nodes[2]);
expect(result.nodes[1]).not.toBe(nodes[1]);
expect(result.nodes[1]?.style).toMatchObject({
opacity: 0.28,
filter: "saturate(0.55)",
});
expect(result.nodes[1]?.className).toContain("custom-node");
expect(result.favoriteCount).toBe(2);
expect(Array.from(result.favoriteNodeIds)).toEqual(["node-a", "node-c"]);
});
it("dims edges when source and target are not both favorite", () => {
const nodes = [
createNode("node-a", { isFavorite: true }),
createNode("node-b", { label: "normal" }),
createNode("node-c", { isFavorite: true }),
];
const edges = [
createEdge("edge-aa", "node-a", "node-c", {
style: { stroke: "rgb(10, 10, 10)", strokeWidth: 2 },
}),
createEdge("edge-ab", "node-a", "node-b", {
style: { stroke: "rgb(20, 20, 20)", strokeWidth: 2 },
}),
createEdge("edge-bc", "node-b", "node-c", {
style: { stroke: "rgb(30, 30, 30)", strokeWidth: 2 },
}),
];
const result = projectCanvasFavoritesVisibility({
nodes,
edges,
favoritesOnly: true,
});
expect(result.edges[0]).toBe(edges[0]);
expect(result.edges[1]).not.toBe(edges[1]);
expect(result.edges[2]).not.toBe(edges[2]);
expect(result.edges[0]?.style).toEqual({ stroke: "rgb(10, 10, 10)", strokeWidth: 2 });
expect(result.edges[1]?.style).toMatchObject({
stroke: "rgb(20, 20, 20)",
strokeWidth: 2,
opacity: 0.18,
});
expect(result.edges[2]?.style).toMatchObject({
stroke: "rgb(30, 30, 30)",
strokeWidth: 2,
opacity: 0.18,
});
expect(result.edges[0]).toBe(edges[0]);
});
it("does not mutate input nodes or edges and only changes affected items", () => {
const nodes = [
createNode("node-a", { isFavorite: true }),
createNode("node-b", { label: "normal" }, { style: { width: 240 } }),
createNode("node-c", { isFavorite: true }, { style: { width: 180 } }),
];
const edges = [
createEdge("edge-ab", "node-a", "node-b", { style: { stroke: "red" } }),
createEdge("edge-ac", "node-a", "node-c", { style: { stroke: "green" } }),
createEdge("edge-bc", "node-b", "node-c", { style: { stroke: "blue" } }),
];
const nodesBefore = structuredClone(nodes);
const edgesBefore = structuredClone(edges);
const result = projectCanvasFavoritesVisibility({
nodes,
edges,
favoritesOnly: true,
});
expect(nodes).toEqual(nodesBefore);
expect(edges).toEqual(edgesBefore);
expect(result.nodes[0]).toBe(nodes[0]);
expect(result.nodes[1]).not.toBe(nodes[1]);
expect(result.nodes[2]).toBe(nodes[2]);
expect(result.edges[0]).not.toBe(edges[0]);
expect(result.edges[1]).toBe(edges[1]);
expect(result.edges[2]).not.toBe(edges[2]);
});
it("dims all nodes and edges when focus mode is active with zero favorites", () => {
const nodes = [createNode("node-a", { label: "first" }), createNode("node-b", { label: "second" })];
const edges = [createEdge("edge-ab", "node-a", "node-b")];
const result = projectCanvasFavoritesVisibility({
nodes,
edges,
favoritesOnly: true,
});
expect(result.favoriteCount).toBe(0);
expect(result.favoriteNodeIds.size).toBe(0);
expect(result.nodes[0]?.style).toMatchObject({ opacity: 0.28, filter: "saturate(0.55)" });
expect(result.nodes[1]?.style).toMatchObject({ opacity: 0.28, filter: "saturate(0.55)" });
expect(result.edges[0]?.style).toMatchObject({ opacity: 0.18 });
});
});


@@ -4,13 +4,25 @@ import React, { act, useEffect } from "react";
import { createRoot, type Root } from "react-dom/client";
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
import {
HANDLE_GLOW_RADIUS_PX,
HANDLE_SNAP_RADIUS_PX,
} from "@/components/canvas/canvas-connection-magnetism";
import {
CanvasConnectionMagnetismProvider,
useCanvasConnectionMagnetism,
} from "@/components/canvas/canvas-connection-magnetism-context";
const connectionStateRef: {
current: {
inProgress?: boolean;
fromNode?: { id: string };
fromHandle?: { id?: string; type?: "source" | "target" };
toNode?: { id: string } | null;
toHandle?: { id?: string | null; type?: "source" | "target" } | null;
isValid?: boolean | null;
};
} = {
current: { inProgress: false },
};
@@ -72,12 +84,20 @@ describe("CanvasHandle", () => {
});
}
container?.remove();
document.documentElement.classList.remove("dark");
container = null;
root = null;
});
async function renderHandle(args?: {
connectionState?: {
inProgress?: boolean;
fromNode?: { id: string };
fromHandle?: { id?: string; type?: "source" | "target" };
toNode?: { id: string } | null;
toHandle?: { id?: string | null; type?: "source" | "target" } | null;
isValid?: boolean | null;
};
activeTarget?: {
nodeId: string;
handleId?: string;
@@ -88,7 +108,7 @@ describe("CanvasHandle", () => {
} | null;
props?: Partial<React.ComponentProps<typeof CanvasHandle>>;
}) {
connectionStateRef.current = args?.connectionState ?? { inProgress: false };
await act(async () => {
root?.render(
@@ -98,7 +118,7 @@ describe("CanvasHandle", () => {
nodeId="node-1"
nodeType="image"
type="target"
position={"left" as React.ComponentProps<typeof CanvasHandle>["position"]}
id="image-in"
{...args?.props}
/>
@@ -128,7 +148,7 @@ describe("CanvasHandle", () => {
it("turns on near-target glow when this handle is active target", async () => {
await renderHandle({
connectionState: { inProgress: true },
activeTarget: {
nodeId: "node-1",
handleId: "image-in",
@@ -145,7 +165,7 @@ describe("CanvasHandle", () => {
it("renders a stronger glow in snapped state than near state", async () => {
await renderHandle({
connectionState: { inProgress: true },
activeTarget: {
nodeId: "node-1",
handleId: "image-in",
@@ -160,7 +180,7 @@ describe("CanvasHandle", () => {
const nearGlow = nearHandle.style.boxShadow;
await renderHandle({
connectionState: { inProgress: true },
activeTarget: {
nodeId: "node-1",
handleId: "image-in",
@@ -176,9 +196,45 @@ describe("CanvasHandle", () => {
expect(snappedHandle.style.boxShadow).not.toBe(nearGlow);
});
it("ramps up glow intensity as pointer gets closer within glow radius", async () => {
await renderHandle({
connectionState: { inProgress: true },
activeTarget: {
nodeId: "node-1",
handleId: "image-in",
handleType: "target",
centerX: 120,
centerY: 80,
distancePx: HANDLE_GLOW_RADIUS_PX - 1,
},
});
const farHandle = getHandleElement();
const farStrength = Number(farHandle.getAttribute("data-glow-strength") ?? "0");
await renderHandle({
connectionState: { inProgress: true },
activeTarget: {
nodeId: "node-1",
handleId: "image-in",
handleType: "target",
centerX: 120,
centerY: 80,
distancePx: HANDLE_SNAP_RADIUS_PX + 1,
},
});
const nearHandle = getHandleElement();
const nearStrength = Number(nearHandle.getAttribute("data-glow-strength") ?? "0");
expect(farHandle.getAttribute("data-glow-state")).toBe("near");
expect(nearHandle.getAttribute("data-glow-state")).toBe("near");
expect(nearStrength).toBeGreaterThan(farStrength);
});
it("does not glow for non-target handles during the same drag", async () => {
await renderHandle({
connectionState: { inProgress: true },
activeTarget: {
nodeId: "other-node",
handleId: "image-in",
@@ -193,13 +249,88 @@ describe("CanvasHandle", () => {
expect(handle.getAttribute("data-glow-state")).toBe("idle");
});
it("shows glow while dragging when connection payload exists without inProgress", async () => {
await renderHandle({
connectionState: {
fromNode: { id: "source-node" },
fromHandle: { id: "image-out", type: "source" },
},
activeTarget: {
nodeId: "node-1",
handleId: "image-in",
handleType: "target",
centerX: 120,
centerY: 80,
distancePx: HANDLE_SNAP_RADIUS_PX + 2,
},
});
const handle = getHandleElement();
expect(handle.getAttribute("data-glow-state")).toBe("near");
});
it("shows glow from native connection hover target even without custom magnet target", async () => {
await renderHandle({
connectionState: {
inProgress: true,
isValid: true,
toNode: { id: "node-1" },
toHandle: { id: "image-in", type: "target" },
},
activeTarget: null,
});
const handle = getHandleElement();
expect(handle.getAttribute("data-glow-state")).toBe("snapped");
});
it("adapts glow rendering between light and dark modes", async () => {
await renderHandle({
connectionState: { inProgress: true },
activeTarget: {
nodeId: "node-1",
handleId: "image-in",
handleType: "target",
centerX: 120,
centerY: 80,
distancePx: HANDLE_SNAP_RADIUS_PX + 1,
},
});
const lightHandle = getHandleElement();
const lightShadow = lightHandle.style.boxShadow;
const lightMode = lightHandle.getAttribute("data-glow-mode");
document.documentElement.classList.add("dark");
await renderHandle({
connectionState: { inProgress: true },
activeTarget: {
nodeId: "node-1",
handleId: "image-in",
handleType: "target",
centerX: 120,
centerY: 80,
distancePx: HANDLE_SNAP_RADIUS_PX + 1,
},
});
const darkHandle = getHandleElement();
const darkShadow = darkHandle.style.boxShadow;
const darkMode = darkHandle.getAttribute("data-glow-mode");
expect(lightMode).toBe("light");
expect(darkMode).toBe("dark");
expect(darkShadow).not.toBe(lightShadow);
});
it("emits stable handle geometry data attributes", async () => {
await renderHandle({
props: {
nodeId: "node-2",
id: undefined,
type: "source",
position: "right" as React.ComponentProps<typeof CanvasHandle>["position"],
},
});


@@ -0,0 +1,77 @@
// @vitest-environment jsdom
import React, { act } from "react";
import { createRoot, type Root } from "react-dom/client";
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
import CanvasSidebar from "@/components/canvas/canvas-sidebar";
import type { Id } from "@/convex/_generated/dataModel";
vi.mock("@/hooks/use-auth-query", () => ({
useAuthQuery: () => ({ name: "Demo Canvas" }),
}));
vi.mock("@/components/canvas/canvas-user-menu", () => ({
CanvasUserMenu: ({ compact }: { compact?: boolean }) => (
<div data-testid={compact ? "canvas-user-menu-compact" : "canvas-user-menu-full"} />
),
}));
vi.mock("@/components/ui/progressive-blur", () => ({
ProgressiveBlur: () => <div data-testid="progressive-blur" />,
}));
vi.mock("next/image", () => ({
__esModule: true,
default: ({ alt }: { alt?: string }) => <span role="img" aria-label={alt ?? ""} />,
}));
(globalThis as typeof globalThis & { IS_REACT_ACT_ENVIRONMENT?: boolean }).IS_REACT_ACT_ENVIRONMENT = true;
describe("CanvasSidebar", () => {
let container: HTMLDivElement | null = null;
let root: Root | null = null;
beforeEach(() => {
container = document.createElement("div");
document.body.appendChild(container);
root = createRoot(container);
});
afterEach(async () => {
if (root) {
await act(async () => {
root?.unmount();
});
}
container?.remove();
container = null;
root = null;
});
it("shows divider-separated category sections with two-column grids in rail mode", async () => {
await act(async () => {
root?.render(
<CanvasSidebar canvasId={"canvas-1" as Id<"canvases">} railMode />,
);
});
const text = container?.textContent ?? "";
expect(text).not.toContain("QUELLE");
expect(text).not.toContain("KI-AUSGABE");
const sourceGrid = container?.querySelector(
'[data-testid="sidebar-rail-category-source-grid"]',
);
expect(sourceGrid).not.toBeNull();
expect(sourceGrid?.className).toContain("grid-cols-2");
const aiOutputDivider = container?.querySelector(
'[data-testid="sidebar-rail-category-ai-output-divider"]',
);
expect(aiOutputDivider).not.toBeNull();
const allCategoryGrids = container?.querySelectorAll('[data-testid$="-grid"]');
expect(allCategoryGrids?.length ?? 0).toBeGreaterThan(1);
});
});


@@ -0,0 +1,173 @@
// @vitest-environment jsdom
import React, { act } from "react";
import { createRoot, type Root } from "react-dom/client";
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
const mocks = vi.hoisted(() => ({
createNodeWithIntersection: vi.fn(async () => undefined),
getCenteredPosition: vi.fn(() => ({ x: 0, y: 0 })),
}));
vi.mock("@/components/canvas/canvas-placement-context", () => ({
useCanvasPlacement: () => ({
createNodeWithIntersection: mocks.createNodeWithIntersection,
}),
}));
vi.mock("@/hooks/use-centered-flow-node-position", () => ({
useCenteredFlowNodePosition: () => mocks.getCenteredPosition,
}));
vi.mock("@/components/ui/dropdown-menu", () => ({
DropdownMenu: ({ children }: { children: React.ReactNode }) => <div>{children}</div>,
DropdownMenuTrigger: ({ children }: { children: React.ReactNode }) => <>{children}</>,
DropdownMenuContent: ({ children }: { children: React.ReactNode }) => <div>{children}</div>,
DropdownMenuItem: ({ children }: { children: React.ReactNode }) => <button type="button">{children}</button>,
DropdownMenuLabel: ({ children }: { children: React.ReactNode }) => <div>{children}</div>,
DropdownMenuSeparator: () => <hr />,
}));
vi.mock("@/components/canvas/credit-display", () => ({
CreditDisplay: () => <div data-testid="credit-display" />,
}));
vi.mock("@/components/canvas/export-button", () => ({
ExportButton: ({ canvasName }: { canvasName: string }) => (
<button type="button">Export {canvasName}</button>
),
}));
vi.mock("@/lib/canvas-node-catalog", () => ({
NODE_CATEGORIES_ORDERED: [],
NODE_CATEGORY_META: {},
catalogEntriesByCategory: () => new Map(),
getTemplateForCatalogType: () => null,
isNodePaletteEnabled: () => false,
}));
import CanvasToolbar from "@/components/canvas/canvas-toolbar";
(globalThis as typeof globalThis & { IS_REACT_ACT_ENVIRONMENT?: boolean }).IS_REACT_ACT_ENVIRONMENT = true;
describe("CanvasToolbar", () => {
let container: HTMLDivElement | null = null;
let root: Root | null = null;
beforeEach(() => {
mocks.createNodeWithIntersection.mockClear();
mocks.getCenteredPosition.mockClear();
container = document.createElement("div");
document.body.appendChild(container);
root = createRoot(container);
});
afterEach(async () => {
if (root) {
await act(async () => {
root?.unmount();
});
}
container?.remove();
container = null;
root = null;
});
it("renders the favorites filter button", async () => {
await act(async () => {
root?.render(
<CanvasToolbar
activeTool="select"
onToolChange={vi.fn()}
onFavoriteFilterChange={vi.fn()}
/>,
);
});
const favoriteButton = container?.querySelector('button[title="Favoriten hervorheben"]');
expect(favoriteButton).not.toBeNull();
});
it("reflects active state via aria-pressed", async () => {
await act(async () => {
root?.render(
<CanvasToolbar
activeTool="select"
onToolChange={vi.fn()}
favoriteFilterActive={false}
onFavoriteFilterChange={vi.fn()}
/>,
);
});
let favoriteButton = container?.querySelector('button[title="Favoriten hervorheben"]');
expect(favoriteButton?.getAttribute("aria-pressed")).toBe("false");
await act(async () => {
root?.render(
<CanvasToolbar
activeTool="select"
onToolChange={vi.fn()}
favoriteFilterActive
onFavoriteFilterChange={vi.fn()}
/>,
);
});
favoriteButton = container?.querySelector('button[title="Favoriten hervorheben"]');
expect(favoriteButton?.getAttribute("aria-pressed")).toBe("true");
});
it("toggles and calls onFavoriteFilterChange", async () => {
const onFavoriteFilterChange = vi.fn();
await act(async () => {
root?.render(
<CanvasToolbar
activeTool="select"
onToolChange={vi.fn()}
favoriteFilterActive={false}
onFavoriteFilterChange={onFavoriteFilterChange}
/>,
);
});
const favoriteButton = container?.querySelector('button[title="Favoriten hervorheben"]');
if (!(favoriteButton instanceof HTMLButtonElement)) {
throw new Error("Favorite filter button not found");
}
await act(async () => {
favoriteButton.click();
});
expect(onFavoriteFilterChange).toHaveBeenCalledTimes(1);
expect(onFavoriteFilterChange).toHaveBeenCalledWith(true);
onFavoriteFilterChange.mockClear();
await act(async () => {
root?.render(
<CanvasToolbar
activeTool="select"
onToolChange={vi.fn()}
favoriteFilterActive
onFavoriteFilterChange={onFavoriteFilterChange}
/>,
);
});
const activeFavoriteButton = container?.querySelector('button[title="Favoriten hervorheben"]');
if (!(activeFavoriteButton instanceof HTMLButtonElement)) {
throw new Error("Active favorite filter button not found");
}
await act(async () => {
activeFavoriteButton.click();
});
expect(onFavoriteFilterChange).toHaveBeenCalledTimes(1);
expect(onFavoriteFilterChange).toHaveBeenCalledWith(false);
});
});


@@ -1,5 +1,9 @@
// @vitest-environment jsdom
import React from "react";
import { act } from "react";
import { createRoot, type Root } from "react-dom/client";
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
import { renderToStaticMarkup } from "react-dom/server";
import { CanvasGraphProvider } from "@/components/canvas/canvas-graph-context";
@@ -15,12 +19,20 @@ type StoreState = {
}>;
};
type ResizeObserverEntryLike = {
target: Element;
contentRect: { width: number; height: number };
};
const storeState: StoreState = {
nodes: [],
edges: [],
};
const compareSurfaceSpy = vi.fn();
let resizeObserverCallback:
| ((entries: ResizeObserverEntryLike[]) => void)
| null = null;
vi.mock("@xyflow/react", () => ({
Handle: () => null,
@@ -53,6 +65,14 @@ vi.mock("@/components/canvas/canvas-handle", () => ({
),
}));
vi.mock("@/hooks/use-pipeline-preview", () => ({
usePipelinePreview: () => ({
canvasRef: { current: null },
isRendering: false,
error: null,
}),
}));
vi.mock("../nodes/base-node-wrapper", () => ({
default: ({ children }: { children: React.ReactNode }) => <div>{children}</div>,
}));
@@ -66,6 +86,8 @@ vi.mock("../nodes/compare-surface", () => ({
import CompareNode from "../nodes/compare-node";
(globalThis as typeof globalThis & { IS_REACT_ACT_ENVIRONMENT?: boolean }).IS_REACT_ACT_ENVIRONMENT = true;
function renderCompareNode(props: Record<string, unknown>) {
return renderToStaticMarkup(
<CanvasGraphProvider
@@ -78,10 +100,47 @@ function renderCompareNode(props: Record<string, unknown>) {
}
describe("CompareNode render preview inputs", () => {
let container: HTMLDivElement | null = null;
let root: Root | null = null;
beforeEach(() => {
storeState.nodes = [];
storeState.edges = [];
compareSurfaceSpy.mockReset();
resizeObserverCallback = null;
globalThis.ResizeObserver = class ResizeObserver {
constructor(callback: (entries: ResizeObserverEntryLike[]) => void) {
resizeObserverCallback = callback;
}
observe(target: Element) {
resizeObserverCallback?.([
{
target,
contentRect: { width: 500, height: 380 },
},
]);
}
unobserve() {}
disconnect() {}
} as unknown as typeof ResizeObserver;
container = document.createElement("div");
document.body.appendChild(container);
root = createRoot(container);
});
afterEach(async () => {
if (root) {
await act(async () => {
root?.unmount();
});
}
container?.remove();
root = null;
container = null;
});
it("passes previewInput to CompareSurface for a connected render node without final output", () => {
@@ -192,6 +251,108 @@ describe("CompareNode render preview inputs", () => {
});
});
it("defaults mixer-backed render compare inputs to preview mode when only sourceComposition exists", () => {
storeState.nodes = [
{
id: "base-image",
type: "image",
data: { url: "https://cdn.example.com/base.png" },
},
{
id: "overlay-image",
type: "asset",
data: { url: "https://cdn.example.com/overlay.png" },
},
{
id: "mixer-1",
type: "mixer",
data: {
blendMode: "multiply",
opacity: 62,
overlayX: 0.1,
overlayY: 0.2,
overlayWidth: 0.4,
overlayHeight: 0.5,
cropLeft: 0.1,
cropTop: 0,
cropRight: 0.2,
cropBottom: 0.1,
},
},
{
id: "render-1",
type: "render",
data: {
lastUploadUrl: "https://cdn.example.com/stale-render-output.png",
},
},
];
storeState.edges = [
{
id: "edge-base-mixer",
source: "base-image",
target: "mixer-1",
targetHandle: "base",
},
{
id: "edge-overlay-mixer",
source: "overlay-image",
target: "mixer-1",
targetHandle: "overlay",
},
{ id: "edge-mixer-render", source: "mixer-1", target: "render-1" },
{
id: "edge-render-compare",
source: "render-1",
target: "compare-1",
targetHandle: "left",
},
];
renderCompareNode({
id: "compare-1",
data: { leftUrl: "https://cdn.example.com/stale-render-output.png" },
selected: false,
dragging: false,
zIndex: 0,
isConnectable: true,
type: "compare",
xPos: 0,
yPos: 0,
width: 500,
height: 380,
sourcePosition: undefined,
targetPosition: undefined,
positionAbsoluteX: 0,
positionAbsoluteY: 0,
});
expect(compareSurfaceSpy).toHaveBeenCalledTimes(1);
expect(compareSurfaceSpy.mock.calls[0]?.[0]).toMatchObject({
finalUrl: "https://cdn.example.com/stale-render-output.png",
preferPreview: true,
previewInput: {
sourceUrl: null,
sourceComposition: {
kind: "mixer",
baseUrl: "https://cdn.example.com/base.png",
overlayUrl: "https://cdn.example.com/overlay.png",
blendMode: "multiply",
opacity: 62,
overlayX: 0.1,
overlayY: 0.2,
overlayWidth: 0.4,
overlayHeight: 0.5,
cropLeft: 0.1,
cropTop: 0,
cropRight: 0.2,
cropBottom: 0.1,
},
steps: [],
},
});
});
it("prefers mixer composite preview over persisted compare finalUrl when mixer is connected", () => {
storeState.nodes = [
{
@@ -275,14 +436,22 @@ describe("CompareNode render preview inputs", () => {
);
expect(mixerCall?.[0]).toMatchObject({
finalUrl: undefined,
nodeWidth: 500,
nodeHeight: 380,
mixerPreviewState: {
status: "ready",
baseUrl: "https://cdn.example.com/base.png",
overlayUrl: "https://cdn.example.com/overlay.png",
blendMode: "multiply",
opacity: 62,
offsetX: 12, overlayX: 0,
offsetY: -4, overlayY: 0,
overlayWidth: 1,
overlayHeight: 1,
cropLeft: 0,
cropTop: 0,
cropRight: 0,
cropBottom: 0,
}, },
}); });
}); });
@@ -317,4 +486,183 @@ describe("CompareNode render preview inputs", () => {
expect(markup).toContain('data-top="35%"'); expect(markup).toContain('data-top="35%"');
expect(markup).toContain('data-top="55%"'); expect(markup).toContain('data-top="55%"');
}); });
it("passes the measured compare surface size to mixer previews instead of the full node box", async () => {
storeState.nodes = [
{
id: "base-image",
type: "image",
data: { url: "https://cdn.example.com/base.png" },
},
{
id: "overlay-image",
type: "asset",
data: { url: "https://cdn.example.com/overlay.png" },
},
{
id: "mixer-1",
type: "mixer",
data: {
blendMode: "normal",
opacity: 100,
overlayX: 0.1,
overlayY: 0.2,
overlayWidth: 0.6,
overlayHeight: 0.5,
},
},
];
storeState.edges = [
{
id: "edge-base-mixer",
source: "base-image",
target: "mixer-1",
targetHandle: "base",
},
{
id: "edge-overlay-mixer",
source: "overlay-image",
target: "mixer-1",
targetHandle: "overlay",
},
{
id: "edge-mixer-compare",
source: "mixer-1",
target: "compare-1",
targetHandle: "left",
},
];
await act(async () => {
root?.render(
<CanvasGraphProvider
nodes={storeState.nodes as Array<{ id: string; type: string; data?: unknown }>}
edges={storeState.edges}
>
<CompareNode
{...({
id: "compare-1",
data: {},
selected: false,
dragging: false,
zIndex: 0,
isConnectable: true,
type: "compare",
xPos: 0,
yPos: 0,
width: 640,
height: 480,
sourcePosition: undefined,
targetPosition: undefined,
positionAbsoluteX: 0,
positionAbsoluteY: 0,
} as unknown as React.ComponentProps<typeof CompareNode>)}
/>
</CanvasGraphProvider>,
);
});
await vi.waitFor(() => {
const latestCompareSurfaceCall = compareSurfaceSpy.mock.calls.findLast(
([props]) =>
Boolean((props as { mixerPreviewState?: { status?: string } }).mixerPreviewState),
);
expect(latestCompareSurfaceCall?.[0]).toMatchObject({
nodeWidth: 500,
nodeHeight: 380,
});
});
const surfaceElement = container?.querySelector(".nodrag.relative.min-h-0.w-full");
expect(surfaceElement).toBeInstanceOf(HTMLDivElement);
await act(async () => {
resizeObserverCallback?.([
{
target: surfaceElement as HTMLDivElement,
contentRect: { width: 468, height: 312 },
},
]);
});
const latestCompareSurfaceCall = compareSurfaceSpy.mock.calls.findLast(
([props]) =>
Boolean((props as { mixerPreviewState?: { status?: string } }).mixerPreviewState),
);
expect(latestCompareSurfaceCall?.[0]).toMatchObject({
nodeWidth: 468,
nodeHeight: 312,
});
expect(latestCompareSurfaceCall?.[0]).not.toMatchObject({
nodeWidth: 640,
nodeHeight: 480,
});
});
it("anchors direct mixer previews to the actual compare surface rect", async () => {
const compareSurfaceModule = await vi.importActual<typeof import("../nodes/compare-surface")>(
"../nodes/compare-surface",
);
const ActualCompareSurface = compareSurfaceModule.default;
await act(async () => {
root?.render(
<CanvasGraphProvider nodes={[]} edges={[]}>
<ActualCompareSurface
mixerPreviewState={{
status: "ready",
baseUrl: "https://cdn.example.com/base.png",
overlayUrl: "https://cdn.example.com/overlay.png",
blendMode: "normal",
opacity: 100,
overlayX: 0,
overlayY: 0,
overlayWidth: 1,
overlayHeight: 1,
cropLeft: 0,
cropTop: 0,
cropRight: 0,
cropBottom: 0,
}}
nodeWidth={500}
nodeHeight={380}
/>
</CanvasGraphProvider>,
);
});
const images = container?.querySelectorAll("img");
const baseImage = images?.[0];
if (!(baseImage instanceof HTMLImageElement)) {
throw new Error("base image not found");
}
Object.defineProperty(baseImage, "naturalWidth", { configurable: true, value: 200 });
Object.defineProperty(baseImage, "naturalHeight", { configurable: true, value: 100 });
await act(async () => {
baseImage.dispatchEvent(new Event("load"));
});
const overlayImage = container?.querySelectorAll("img")?.[1];
if (!(overlayImage instanceof HTMLImageElement)) {
throw new Error("overlay image not found");
}
Object.defineProperty(overlayImage, "naturalWidth", { configurable: true, value: 200 });
Object.defineProperty(overlayImage, "naturalHeight", { configurable: true, value: 100 });
await act(async () => {
overlayImage.dispatchEvent(new Event("load"));
});
const overlayFrame = overlayImage.parentElement;
expect(overlayFrame?.style.left).toBe("0%");
expect(overlayFrame?.style.top).toBe("17.105263157894736%");
expect(overlayFrame?.style.width).toBe("100%");
expect(overlayFrame?.style.height).toBe("65.78947368421053%");
});
});
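The expected percentage strings in the last test fall out of contain-fitting the 200×100 mocked image into the 500×380 compare surface. A small standalone sketch of that letterbox math (`containFit` is a hypothetical helper name for illustration, not the component's actual code):

```typescript
// Contain-fit an image into a box and express the resulting rect as
// percentages of the box, matching the inline styles asserted above.
function containFit(
  imageWidth: number,
  imageHeight: number,
  boxWidth: number,
  boxHeight: number,
): { leftPct: number; topPct: number; widthPct: number; heightPct: number } {
  const scale = Math.min(boxWidth / imageWidth, boxHeight / imageHeight);
  const fittedWidth = imageWidth * scale;
  const fittedHeight = imageHeight * scale;
  return {
    leftPct: ((boxWidth - fittedWidth) / 2 / boxWidth) * 100,
    topPct: ((boxHeight - fittedHeight) / 2 / boxHeight) * 100,
    widthPct: (fittedWidth / boxWidth) * 100,
    heightPct: (fittedHeight / boxHeight) * 100,
  };
}

// A 2:1 image in a 500×380 box is width-limited: it spans the full width
// (left 0%, width 100%) and is vertically centered at top ~17.105%,
// height ~65.789% — the exact values the test asserts.
const rect = containFit(200, 100, 500, 380);
```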

View File

@@ -19,11 +19,23 @@ const reactFlowStateRef: {
current: {
nodes: Array<{ id: string; type: string; position: { x: number; y: number }; data: object }>;
edges: Array<{ id: string; source: string; target: string; targetHandle?: string | null }>;
screenToFlowPosition: ({ x, y }: { x: number; y: number }) => { x: number; y: number };
};
} = {
current: {
nodes: [],
edges: [],
screenToFlowPosition: ({ x, y }) => ({ x, y }),
},
};
const connectionStateRef: {
current: {
fromHandle?: { type?: "source" | "target" };
};
} = {
current: {
fromHandle: { type: "source" },
},
};
@@ -35,7 +47,9 @@ vi.mock("@xyflow/react", async () => {
useReactFlow: () => ({
getNodes: () => reactFlowStateRef.current.nodes,
getEdges: () => reactFlowStateRef.current.edges,
screenToFlowPosition: reactFlowStateRef.current.screenToFlowPosition,
}),
useConnection: () => connectionStateRef.current,
};
});
@@ -86,6 +100,7 @@ describe("CustomConnectionLine", () => {
document
.querySelectorAll("[data-testid='custom-line-magnet-handle']")
.forEach((element) => element.remove());
document.documentElement.classList.remove("dark");
container = null;
root = null;
});
@@ -93,6 +108,10 @@ describe("CustomConnectionLine", () => {
function renderLine(args?: {
withMagnetHandle?: boolean;
connectionStatus?: ConnectionLineComponentProps["connectionStatus"];
omitFromHandleType?: boolean;
toX?: number;
toY?: number;
pointer?: { x: number; y: number };
}) {
document
.querySelectorAll("[data-testid='custom-line-magnet-handle']")
@@ -104,6 +123,11 @@ describe("CustomConnectionLine", () => {
{ id: "target-node", type: "render", position: { x: 0, y: 0 }, data: {} },
],
edges: [],
screenToFlowPosition: ({ x, y }) => ({ x, y }),
};
connectionStateRef.current = {
fromHandle: { type: "source" },
};
if (args?.withMagnetHandle && container) {
@@ -128,11 +152,22 @@ describe("CustomConnectionLine", () => {
}
act(() => {
const lineProps = {
...baseProps,
...(args?.toX !== undefined ? { toX: args.toX } : null),
...(args?.toY !== undefined ? { toY: args.toY } : null),
...(args?.pointer ? { pointer: args.pointer } : null),
fromHandle: {
...baseProps.fromHandle,
...(args?.omitFromHandleType ? { type: undefined } : null),
},
} as ConnectionLineComponentProps;
root?.render(
<CanvasConnectionMagnetismProvider>
<svg>
<CustomConnectionLine
-{...baseProps}
+{...lineProps}
connectionStatus={args?.connectionStatus ?? "valid"}
/>
</svg>
@@ -170,6 +205,17 @@ describe("CustomConnectionLine", () => {
expect(path.getAttribute("d")).toContain("220");
});
it("still resolves magnet target when fromHandle.type is missing", () => {
renderLine({
withMagnetHandle: true,
omitFromHandleType: true,
});
const path = getPath();
expect(path.getAttribute("d")).toContain("300");
expect(path.getAttribute("d")).toContain("220");
});
it("strengthens stroke visual feedback while snapped", () => {
renderLine();
const idlePath = getPath();
@@ -185,6 +231,29 @@ describe("CustomConnectionLine", () => {
expect(snappedPath.style.filter).not.toBe(idleFilter);
});
it("ramps stroke feedback up as pointer gets closer before snap", () => {
renderLine({
withMagnetHandle: true,
toX: 252,
toY: 220,
pointer: { x: 252, y: 220 },
});
const farNearPath = getPath();
const farNearWidth = Number(farNearPath.style.strokeWidth || "0");
renderLine({
withMagnetHandle: true,
toX: 266,
toY: 220,
pointer: { x: 266, y: 220 },
});
const closeNearPath = getPath();
const closeNearWidth = Number(closeNearPath.style.strokeWidth || "0");
expect(farNearWidth).toBeGreaterThan(2.5);
expect(closeNearWidth).toBeGreaterThan(farNearWidth);
});
it("keeps invalid connection opacity behavior while snapped", () => {
renderLine({
withMagnetHandle: true,
@@ -194,4 +263,48 @@ describe("CustomConnectionLine", () => {
const path = getPath();
expect(path.style.opacity).toBe("0.45");
});
it("uses client pointer coordinates for magnet lookup and converts snapped endpoint back to flow space", () => {
reactFlowStateRef.current.screenToFlowPosition = ({ x, y }) => ({
x: Math.round(x / 10),
y: Math.round(y / 10),
});
renderLine({
withMagnetHandle: true,
toX: 29,
toY: 21,
pointer: { x: 300, y: 220 },
});
const path = getPath();
expect(path.getAttribute("d")).toContain("30");
expect(path.getAttribute("d")).toContain("22");
});
it("adjusts glow filter between light and dark mode", () => {
renderLine({
withMagnetHandle: true,
toX: 266,
toY: 220,
pointer: { x: 266, y: 220 },
});
const lightPath = getPath();
const lightFilter = lightPath.style.filter;
document.documentElement.classList.add("dark");
renderLine({
withMagnetHandle: true,
toX: 266,
toY: 220,
pointer: { x: 266, y: 220 },
});
const darkPath = getPath();
const darkFilter = darkPath.style.filter;
expect(lightFilter).not.toBe("");
expect(darkFilter).not.toBe("");
expect(darkFilter).not.toBe(lightFilter);
});
});

View File

@@ -8,6 +8,12 @@ import { afterEach, describe, expect, it, vi } from "vitest";
import DefaultEdge from "@/components/canvas/edges/default-edge";
const mockViewport = {
x: 0,
y: 0,
zoom: 1,
};
vi.mock("@xyflow/react", async () => {
const actual = await vi.importActual<typeof import("@xyflow/react")>(
"@xyflow/react",
@@ -18,6 +24,7 @@ vi.mock("@xyflow/react", async () => {
EdgeLabelRenderer: ({ children }: { children: ReactNode }) => (
<foreignObject>{children}</foreignObject>
),
useViewport: () => mockViewport,
};
});
@@ -95,6 +102,8 @@ describe("DefaultEdge", () => {
let container: HTMLDivElement | null = null;
afterEach(() => {
mockViewport.zoom = 1;
if (root) {
act(() => {
root?.unmount();
@@ -185,4 +194,33 @@ describe("DefaultEdge", () => {
expect(edgePath).not.toBeNull();
expect(edgePath?.getAttribute("d")).toBeTruthy();
});
it("applies zoom-aware scaling with clamps to keep the plus legible", () => {
const onInsertClick = vi.fn<(anchor: EdgeInsertAnchor) => void>();
mockViewport.zoom = 0.2;
({ container, root } = renderEdge({ onInsertClick, isMenuOpen: true }));
const insertButton = getInsertButton(container);
expect(insertButton.style.transform).toContain("scale(2.2)");
mockViewport.zoom = 4;
act(() => {
root?.render(
<svg>
<DefaultEdgeComponent {...baseProps} onInsertClick={onInsertClick} isMenuOpen />
</svg>,
);
});
expect(insertButton.style.transform).toContain("scale(0.95)");
});
it("uses stronger visual styling for distant zoom visibility", () => {
const onInsertClick = vi.fn<(anchor: EdgeInsertAnchor) => void>();
({ container, root } = renderEdge({ onInsertClick, isMenuOpen: true }));
const insertButton = getInsertButton(container);
expect(insertButton.className).toContain("border-2");
expect(insertButton.className).toContain("ring-1");
expect(insertButton.className).toContain("shadow");
});
});

File diff suppressed because it is too large

View File

@@ -218,8 +218,10 @@ describe("useCanvasConnections", () => {
defaultData: {
blendMode: "normal",
opacity: 100,
-offsetX: 0,
-offsetY: 0,
+overlayX: 0,
+overlayY: 0,
overlayWidth: 1,
overlayHeight: 1,
},
}),
);
@@ -232,8 +234,10 @@ describe("useCanvasConnections", () => {
data: {
blendMode: "normal",
opacity: 100,
-offsetX: 0,
-offsetY: 0,
+overlayX: 0,
+overlayY: 0,
overlayWidth: 1,
overlayHeight: 1,
},
}),
);
@@ -539,6 +543,74 @@ describe("useCanvasConnections", () => {
expect(latestHandlersRef.current?.connectionDropMenu).toBeNull();
});
it("rejects self-drops on a note instead of auto-splitting its incoming edge", async () => {
const runCreateEdgeMutation = vi.fn(async () => undefined);
const runSplitEdgeAtExistingNodeMutation = vi.fn(async () => undefined);
const showConnectionRejectedToast = vi.fn();
container = document.createElement("div");
document.body.appendChild(container);
root = createRoot(container);
await act(async () => {
root?.render(
<HookHarness
helperResult={{
sourceNodeId: "node-note",
targetNodeId: "node-note",
sourceHandle: undefined,
targetHandle: undefined,
}}
runCreateEdgeMutation={runCreateEdgeMutation}
runSplitEdgeAtExistingNodeMutation={runSplitEdgeAtExistingNodeMutation}
showConnectionRejectedToast={showConnectionRejectedToast}
nodes={[
{ id: "node-image", type: "image", position: { x: 0, y: 0 }, data: {} },
{ id: "node-note", type: "note", position: { x: 240, y: 120 }, data: {} },
]}
edges={[
{
id: "edge-image-note",
source: "node-image",
target: "node-note",
},
]}
/>,
);
});
await act(async () => {
latestHandlersRef.current?.onConnectStart?.(
{} as MouseEvent,
{
nodeId: "node-note",
handleId: null,
handleType: "source",
} as never,
);
latestHandlersRef.current?.onConnectEnd(
{ clientX: 260, clientY: 160 } as MouseEvent,
{
isValid: false,
from: { x: 0, y: 0 },
fromNode: { id: "node-note", type: "note" },
fromHandle: { id: null, type: "source" },
fromPosition: null,
to: { x: 260, y: 160 },
toHandle: null,
toNode: null,
toPosition: null,
pointer: null,
} as never,
);
});
expect(runSplitEdgeAtExistingNodeMutation).not.toHaveBeenCalled();
expect(runCreateEdgeMutation).not.toHaveBeenCalled();
expect(showConnectionRejectedToast).toHaveBeenCalledWith("self-loop");
expect(latestHandlersRef.current?.connectionDropMenu).toBeNull();
});
it("rejects text to ai-video body drops", async () => {
const runCreateEdgeMutation = vi.fn(async () => undefined);
const showConnectionRejectedToast = vi.fn();

View File

@@ -5,6 +5,52 @@ import { validateCanvasConnectionPolicy } from "@/lib/canvas-connection-policy";
export const HANDLE_GLOW_RADIUS_PX = 56;
export const HANDLE_SNAP_RADIUS_PX = 40;
function clamp01(value: number): number {
if (!Number.isFinite(value)) {
return 0;
}
if (value <= 0) {
return 0;
}
if (value >= 1) {
return 1;
}
return value;
}
function smoothstep(value: number): number {
const v = clamp01(value);
return v * v * (3 - 2 * v);
}
export function resolveCanvasGlowStrength(args: {
distancePx: number;
glowRadiusPx?: number;
snapRadiusPx?: number;
}): number {
const glowRadius = args.glowRadiusPx ?? HANDLE_GLOW_RADIUS_PX;
const snapRadius = args.snapRadiusPx ?? HANDLE_SNAP_RADIUS_PX;
if (!Number.isFinite(args.distancePx)) {
return 0;
}
if (args.distancePx <= 0) {
return 1;
}
if (args.distancePx >= glowRadius) {
return 0;
}
if (args.distancePx <= snapRadius) {
return 1;
}
const preSnapRange = Math.max(1, glowRadius - snapRadius);
const progressToSnap = (glowRadius - args.distancePx) / preSnapRange;
const eased = smoothstep(progressToSnap);
return 0.22 + eased * 0.68;
}
export type CanvasMagnetTarget = {
nodeId: string;
handleId?: string;
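For reference, the glow easing added above can be exercised on its own. The following is a lightly condensed, self-contained copy of the helpers from this file, showing the strength at a few representative distances:

```typescript
// Condensed copy of the glow-strength easing from canvas-connection-magnetism.
const HANDLE_GLOW_RADIUS_PX = 56;
const HANDLE_SNAP_RADIUS_PX = 40;

function clamp01(value: number): number {
  if (!Number.isFinite(value)) return 0;
  return value <= 0 ? 0 : value >= 1 ? 1 : value;
}

// Hermite smoothstep: eases 0..1 with zero slope at both ends.
function smoothstep(value: number): number {
  const v = clamp01(value);
  return v * v * (3 - 2 * v);
}

function resolveCanvasGlowStrength(args: {
  distancePx: number;
  glowRadiusPx?: number;
  snapRadiusPx?: number;
}): number {
  const glowRadius = args.glowRadiusPx ?? HANDLE_GLOW_RADIUS_PX;
  const snapRadius = args.snapRadiusPx ?? HANDLE_SNAP_RADIUS_PX;
  if (!Number.isFinite(args.distancePx)) return 0;
  if (args.distancePx <= 0) return 1;
  if (args.distancePx >= glowRadius) return 0;
  if (args.distancePx <= snapRadius) return 1;
  const preSnapRange = Math.max(1, glowRadius - snapRadius);
  const progressToSnap = (glowRadius - args.distancePx) / preSnapRange;
  return 0.22 + smoothstep(progressToSnap) * 0.68;
}

// Inside the snap radius the glow saturates at 1; past the glow radius it
// drops to 0; in between it ramps from 0.22 toward 0.9 with smoothstep easing.
// resolveCanvasGlowStrength({ distancePx: 30 }) → 1
// resolveCanvasGlowStrength({ distancePx: 48 }) → ≈ 0.56 (midpoint of the ramp)
// resolveCanvasGlowStrength({ distancePx: 60 }) → 0
```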

View File

@@ -0,0 +1,140 @@
import type { Edge as RFEdge, Node as RFNode } from "@xyflow/react";
import { readNodeFavorite } from "@/lib/canvas-node-favorite";
type CanvasNode = RFNode<Record<string, unknown>>;
type CanvasEdge = RFEdge<Record<string, unknown>>;
type ProjectCanvasFavoritesVisibilityArgs = {
nodes: readonly CanvasNode[];
edges: readonly CanvasEdge[];
favoritesOnly: boolean;
};
type ProjectCanvasFavoritesVisibilityResult = {
nodes: CanvasNode[];
edges: CanvasEdge[];
favoriteNodeIds: ReadonlySet<string>;
favoriteCount: number;
};
const DIMMED_NODE_OPACITY = 0.28;
const DIMMED_NODE_FILTER = "saturate(0.55)";
const DIMMED_EDGE_OPACITY = 0.18;
const DIMMED_NODE_TRANSITION = "opacity 160ms ease, filter 160ms ease";
const DIMMED_EDGE_TRANSITION = "opacity 160ms ease";
function mergeTransition(current: unknown, required: string): string {
if (typeof current !== "string" || current.trim().length === 0) {
return required;
}
if (current.includes(required)) {
return current;
}
return `${current}, ${required}`;
}
function shallowEqualRecord(
left: Record<string, unknown> | undefined,
right: Record<string, unknown> | undefined,
): boolean {
if (left === right) {
return true;
}
if (!left || !right) {
return false;
}
const leftKeys = Object.keys(left);
const rightKeys = Object.keys(right);
if (leftKeys.length !== rightKeys.length) {
return false;
}
return leftKeys.every((key) => Object.is(left[key], right[key]));
}
function styleToRecord(
style: CanvasNode["style"] | CanvasEdge["style"],
): Record<string, unknown> | undefined {
return style ? (style as Record<string, unknown>) : undefined;
}
function buildDimmedNodeStyle(
style: CanvasNode["style"],
): CanvasNode["style"] {
const next = {
...(style ?? {}),
opacity: DIMMED_NODE_OPACITY,
filter: DIMMED_NODE_FILTER,
transition: mergeTransition(style?.transition, DIMMED_NODE_TRANSITION),
};
return next;
}
function buildDimmedEdgeStyle(
style: CanvasEdge["style"],
): CanvasEdge["style"] {
const next = {
...(style ?? {}),
opacity: DIMMED_EDGE_OPACITY,
transition: mergeTransition(style?.transition, DIMMED_EDGE_TRANSITION),
};
return next;
}
export function projectCanvasFavoritesVisibility({
nodes,
edges,
favoritesOnly,
}: ProjectCanvasFavoritesVisibilityArgs): ProjectCanvasFavoritesVisibilityResult {
const favoriteNodeIds = new Set<string>();
for (const node of nodes) {
if (readNodeFavorite(node.data)) {
favoriteNodeIds.add(node.id);
}
}
const favoriteCount = favoriteNodeIds.size;
const hasFavorites = favoriteCount > 0;
const projectedNodes = nodes.map((node) => {
const shouldDim = favoritesOnly && !favoriteNodeIds.has(node.id);
if (!shouldDim) {
return node;
}
const nextStyle = buildDimmedNodeStyle(node.style);
return shallowEqualRecord(styleToRecord(node.style), styleToRecord(nextStyle))
? node
: {
...node,
style: nextStyle,
};
});
const projectedEdges = edges.map((edge) => {
const isFavoriteEdge = favoriteNodeIds.has(edge.source) && favoriteNodeIds.has(edge.target);
const shouldDim = favoritesOnly && (!hasFavorites || !isFavoriteEdge);
if (!shouldDim) {
return edge;
}
const nextStyle = buildDimmedEdgeStyle(edge.style);
return shallowEqualRecord(styleToRecord(edge.style), styleToRecord(nextStyle))
? edge
: {
...edge,
style: nextStyle,
};
});
return {
nodes: projectedNodes,
edges: projectedEdges,
favoriteNodeIds,
favoriteCount,
};
}

View File

@@ -2,11 +2,14 @@
import { Handle, useConnection } from "@xyflow/react";
-import { HANDLE_SNAP_RADIUS_PX } from "@/components/canvas/canvas-connection-magnetism";
+import {
+resolveCanvasGlowStrength,
+} from "@/components/canvas/canvas-connection-magnetism";
import { useCanvasConnectionMagnetism } from "@/components/canvas/canvas-connection-magnetism-context";
import {
canvasHandleAccentColor,
-canvasHandleAccentColorWithAlpha,
+canvasHandleGlowShadow,
+type EdgeGlowColorMode,
} from "@/lib/canvas-utils";
import { cn } from "@/lib/utils";
@@ -34,30 +37,89 @@ export default function CanvasHandle({
const connection = useConnection();
const { activeTarget } = useCanvasConnectionMagnetism();
const connectionState = connection as {
inProgress?: boolean;
isValid?: boolean | null;
fromNode?: unknown;
toNode?: unknown;
fromHandle?: unknown;
toHandle?: unknown;
};
const hasConnectionPayload =
connectionState.fromNode !== undefined ||
connectionState.toNode !== undefined ||
connectionState.fromHandle !== undefined ||
connectionState.toHandle !== undefined;
const isConnectionDragActive =
connectionState.inProgress === true ||
(connectionState.inProgress === undefined && hasConnectionPayload);
const handleId = normalizeHandleId(id);
const targetHandleId = normalizeHandleId(activeTarget?.handleId);
const toNodeId =
connectionState.toNode &&
typeof connectionState.toNode === "object" &&
"id" in connectionState.toNode &&
typeof (connectionState.toNode as { id?: unknown }).id === "string"
? ((connectionState.toNode as { id: string }).id ?? null)
: null;
const toHandleMeta =
connectionState.toHandle && typeof connectionState.toHandle === "object"
? (connectionState.toHandle as { id?: string | null; type?: "source" | "target" })
: null;
const toHandleId = normalizeHandleId(
toHandleMeta?.id === null ? undefined : toHandleMeta?.id,
);
const toHandleType =
toHandleMeta?.type === "source" || toHandleMeta?.type === "target"
? toHandleMeta.type
: null;
const colorMode: EdgeGlowColorMode =
typeof document !== "undefined" && document.documentElement.classList.contains("dark")
? "dark"
: "light";
const isActiveTarget =
-connection.inProgress &&
+isConnectionDragActive &&
activeTarget !== null &&
activeTarget.nodeId === nodeId &&
activeTarget.handleType === type &&
targetHandleId === handleId;
-const glowState: "idle" | "near" | "snapped" = isActiveTarget
-? activeTarget.distancePx <= HANDLE_SNAP_RADIUS_PX
-? "snapped"
-: "near"
-: "idle";
+const isNativeHoverTarget =
+connectionState.inProgress === true &&
+toNodeId === nodeId &&
+toHandleType === type &&
+toHandleId === handleId;
let glowStrength = 0;
if (isActiveTarget) {
glowStrength = resolveCanvasGlowStrength({
distancePx: activeTarget.distancePx,
});
} else if (isNativeHoverTarget) {
glowStrength = connectionState.isValid === true ? 1 : 0.68;
}
const glowState: "idle" | "near" | "snapped" =
glowStrength <= 0 ? "idle" : glowStrength >= 0.96 ? "snapped" : "near";
const accentColor = canvasHandleAccentColor({
nodeType,
handleId,
handleType: type,
});
-const glowAlpha = glowState === "snapped" ? 0.62 : glowState === "near" ? 0.4 : 0;
-const ringAlpha = glowState === "snapped" ? 0.34 : glowState === "near" ? 0.2 : 0;
-const glowSize = glowState === "snapped" ? 14 : glowState === "near" ? 10 : 0;
-const ringSize = glowState === "snapped" ? 6 : glowState === "near" ? 4 : 0;
+const boxShadow = canvasHandleGlowShadow({
+nodeType,
+handleId,
+handleType: type,
strength: glowStrength,
colorMode,
});
return (
<Handle
@@ -71,15 +133,14 @@ export default function CanvasHandle({
style={{
...style,
backgroundColor: accentColor,
-boxShadow:
-glowState === "idle"
-? undefined
-: `0 0 0 ${ringSize}px ${canvasHandleAccentColorWithAlpha({ nodeType, handleId, handleType: type }, ringAlpha)}, 0 0 ${glowSize}px ${canvasHandleAccentColorWithAlpha({ nodeType, handleId, handleType: type }, glowAlpha)}`,
+boxShadow,
}}
data-node-id={nodeId}
data-handle-id={id ?? ""}
data-handle-type={type}
data-glow-state={glowState}
data-glow-strength={glowStrength.toFixed(3)}
data-glow-mode={colorMode}
/>
);
}

View File

@@ -15,8 +15,8 @@ import type { Id } from "@/convex/_generated/dataModel";
const SIDEBAR_DEFAULT_SIZE = "18%";
const SIDEBAR_COLLAPSE_THRESHOLD = "10%";
const SIDEBAR_MAX_SIZE = "40%";
-const SIDEBAR_COLLAPSED_SIZE = "64px";
-const SIDEBAR_RAIL_MAX_WIDTH_PX = 112;
+const SIDEBAR_COLLAPSED_SIZE = "84px";
+const SIDEBAR_RAIL_MAX_WIDTH_PX = 148;
const MAIN_PANEL_MIN_SIZE = "40%";
type CanvasShellProps = {

View File

@@ -32,8 +32,6 @@ import {
import { CanvasUserMenu } from "@/components/canvas/canvas-user-menu";
import { ProgressiveBlur } from "@/components/ui/progressive-blur";
-import { useAuthQuery } from "@/hooks/use-auth-query";
-import { api } from "@/convex/_generated/api";
import type { Id } from "@/convex/_generated/dataModel";
import {
NODE_CATEGORY_META,
@@ -107,14 +105,14 @@ function SidebarRow({
className={cn(
"rounded-lg border transition-colors",
compact
-? "flex h-10 w-full items-center justify-center p-0"
+? "flex aspect-square min-h-0 w-full items-center justify-center rounded-md p-0"
: "flex items-center gap-2 px-3 py-2 text-sm",
enabled
? "cursor-grab border-border/80 bg-card hover:bg-accent active:cursor-grabbing"
: "cursor-not-allowed border-transparent bg-muted/30 text-muted-foreground",
)}
>
-<Icon className="size-4 shrink-0 opacity-80" />
+<Icon className={cn("shrink-0 opacity-80", compact ? "size-[1.3rem]" : "size-4")} />
{!compact ? <span className="min-w-0 flex-1 truncate">{entry.label}</span> : null}
{!compact && entry.phase > 1 ? (
<span className="shrink-0 text-[10px] font-medium tabular-nums text-muted-foreground/80">
@@ -135,31 +133,39 @@ export default function CanvasSidebar({
canvasId,
railMode = false,
}: CanvasSidebarProps) {
-const canvas = useAuthQuery(api.canvases.get, { canvasId });
+void canvasId;
const byCategory = catalogEntriesByCategory();
const [collapsedByCategory, setCollapsedByCategory] = useState<
Partial<Record<(typeof NODE_CATEGORIES_ORDERED)[number], boolean>>
>(() =>
Object.fromEntries(
-NODE_CATEGORIES_ORDERED.map((categoryId) => [categoryId, categoryId !== "source"]),
+NODE_CATEGORIES_ORDERED.map((categoryId) => [categoryId, false]),
),
);
-const railEntries = NODE_CATEGORIES_ORDERED.flatMap(
-(categoryId) => byCategory.get(categoryId) ?? [],
-);
return (
<aside className="flex h-full w-full min-w-0 overflow-hidden flex-col border-r border-border/80 bg-background">
{railMode ? (
<div className="border-b border-border/80 px-2 py-3">
<div className="flex items-center justify-center">
-<span
-className="line-clamp-1 text-center text-[11px] font-semibold uppercase tracking-wide text-muted-foreground"
-title={canvas?.name ?? "Canvas"}
->
-{canvas?.name?.slice(0, 2).toUpperCase() ?? "CV"}
-</span>
+<div className="relative">
+<NextImage
+src="/logos/lemonspace-logo-v2-black-rgb.svg"
+alt="LemonSpace"
+width={74}
+height={14}
+className="h-auto w-[4.625rem] dark:hidden"
+priority
+/>
+<NextImage
+src="/logos/lemonspace-logo-v2-white-rgb.svg"
+alt="LemonSpace"
+width={74}
+height={14}
+className="hidden h-auto w-[4.625rem] dark:block"
+priority
+/>
+</div>
</div>
</div>
) : (
@@ -191,22 +197,41 @@ export default function CanvasSidebar({
<div
className={cn(
"h-full overflow-y-auto overscroll-contain",
-railMode ? "p-2 pb-20" : "p-3 pb-28",
+railMode ? "px-2 py-2 pb-20" : "p-3 pb-28",
)}
>
{railMode ? (
-<div className="flex flex-col gap-1.5">
-{railEntries.map((entry) => (
+<div className="flex flex-col gap-3">
+{NODE_CATEGORIES_ORDERED.map((categoryId, index) => {
const entries = byCategory.get(categoryId) ?? [];
if (entries.length === 0) return null;
return (
<section
key={categoryId}
className={cn("space-y-1.5", index === 0 ? "pt-0" : "border-t border-border/70 pt-2")}
>
{index > 0 ? (
<div data-testid={`sidebar-rail-category-${categoryId}-divider`} aria-hidden="true" />
) : null}
<div
data-testid={`sidebar-rail-category-${categoryId}-grid`}
className="grid grid-cols-2 gap-1"
>
{entries.map((entry) => (
<SidebarRow key={entry.type} entry={entry} compact />
))}
</div>
</section>
);
})}
</div>
) : (
<>
{NODE_CATEGORIES_ORDERED.map((categoryId) => {
const entries = byCategory.get(categoryId) ?? [];
if (entries.length === 0) return null;
const { label } = NODE_CATEGORY_META[categoryId];
-const isCollapsed = collapsedByCategory[categoryId] ?? categoryId !== "source";
+const isCollapsed = collapsedByCategory[categoryId] ?? false;
return (
<div key={categoryId} className="mb-4 last:mb-0">
<button
@@ -214,7 +239,7 @@ export default function CanvasSidebar({
                       onClick={() =>
                         setCollapsedByCategory((prev) => ({
                           ...prev,
-                          [categoryId]: !(prev[categoryId] ?? categoryId !== "source"),
+                          [categoryId]: !(prev[categoryId] ?? false),
                         }))
                       }
                       className="mb-2 flex w-full items-center justify-between rounded-md px-0.5 py-1 text-left text-xs font-medium uppercase tracking-wide text-muted-foreground transition-colors hover:bg-muted/40 hover:text-foreground"


@@ -8,6 +8,7 @@ import {
   Plus,
   Redo2,
   Scissors,
+  Star,
   Undo2,
 } from "lucide-react";
@@ -40,12 +41,18 @@ interface CanvasToolbarProps {
   canvasName?: string;
   activeTool: CanvasNavTool;
   onToolChange: (tool: CanvasNavTool) => void;
+  favoriteFilterActive?: boolean;
+  onFavoriteFilterChange?: (active: boolean) => void;
+  favoriteCount?: number;
 }
 export default function CanvasToolbar({
   canvasName,
   activeTool,
   onToolChange,
+  favoriteFilterActive = false,
+  onFavoriteFilterChange,
+  favoriteCount,
 }: CanvasToolbarProps) {
   const { createNodeWithIntersection } = useCanvasPlacement();
   const getCenteredPosition = useCenteredFlowNodePosition();
@@ -66,6 +73,10 @@ export default function CanvasToolbar({
   const byCategory = catalogEntriesByCategory();
   const resolvedCanvasName = canvasName?.trim() || "Unbenannter Canvas";
+  const favoritesLabel =
+    typeof favoriteCount === "number"
+      ? `Favoriten hervorheben (${favoriteCount})`
+      : "Favoriten hervorheben";
   const toolBtn = (tool: CanvasNavTool, icon: React.ReactNode, label: string) => (
     <Button
@@ -144,6 +155,21 @@ export default function CanvasToolbar({
       )}
       {toolBtn("scissor", <Scissors className="size-4" />, "Schere (K) — Verbindungen kappen")}
+      {onFavoriteFilterChange ? (
+        <Button
+          type="button"
+          size="icon"
+          variant={favoriteFilterActive ? "secondary" : "ghost"}
+          className="size-9 shrink-0"
+          aria-label={favoritesLabel}
+          title={favoritesLabel}
+          aria-pressed={favoriteFilterActive}
+          onClick={() => onFavoriteFilterChange(!favoriteFilterActive)}
+        >
+          <Star className={favoriteFilterActive ? "size-4 fill-current" : "size-4"} />
+        </Button>
+      ) : null}
       <Button
         type="button"
         size="icon"


@@ -78,7 +78,9 @@ import { useCanvasEdgeTypes } from "./use-canvas-edge-types";
 import { useCanvasFlowReconciliation } from "./use-canvas-flow-reconciliation";
 import { useCanvasLocalSnapshotPersistence } from "./use-canvas-local-snapshot-persistence";
 import { useCanvasSyncEngine } from "./use-canvas-sync-engine";
+import { HANDLE_GLOW_RADIUS_PX } from "./canvas-connection-magnetism";
 import { CanvasConnectionMagnetismProvider } from "./canvas-connection-magnetism-context";
+import { projectCanvasFavoritesVisibility } from "./canvas-favorites-visibility";
 interface CanvasInnerProps {
   canvasId: Id<"canvases">;
@@ -167,6 +169,7 @@ function CanvasInner({ canvasId }: CanvasInnerProps) {
     { x: number; y: number }[] | null
   >(null);
   const [navTool, setNavTool] = useState<CanvasNavTool>("select");
+  const [focusFavorites, setFocusFavorites] = useState(false);
   useCanvasLocalSnapshotPersistence<RFNode, RFEdge>({
     canvasId: canvasId as string,
@@ -207,6 +210,16 @@ function CanvasInner({ canvasId }: CanvasInnerProps) {
     [edges],
   );
+  const favoriteProjection = useMemo(
+    () =>
+      projectCanvasFavoritesVisibility({
+        nodes,
+        edges,
+        favoritesOnly: focusFavorites,
+      }),
+    [edges, nodes, focusFavorites],
+  );
   const pendingRemovedEdgeIds = useMemo(
     () => {
       void convexEdges;
@@ -581,6 +594,9 @@ function CanvasInner({ canvasId }: CanvasInnerProps) {
         canvasName={canvas?.name}
         activeTool={navTool}
         onToolChange={handleNavToolChange}
+        favoriteFilterActive={focusFavorites}
+        onFavoriteFilterChange={setFocusFavorites}
+        favoriteCount={favoriteProjection.favoriteCount}
       />
       <CanvasAppMenu canvasId={canvasId} />
       <CanvasCommandPalette />
@@ -642,8 +658,8 @@ function CanvasInner({ canvasId }: CanvasInnerProps) {
       <CanvasGraphProvider nodes={canvasGraphNodes} edges={canvasGraphEdges}>
         <ReactFlow
           style={edgeInsertReflowStyle}
-          nodes={nodes}
-          edges={edges}
+          nodes={favoriteProjection.nodes}
+          edges={favoriteProjection.edges}
           onlyRenderVisibleElements
           defaultEdgeOptions={defaultEdgeOptions}
           connectionLineComponent={CustomConnectionLine}
@@ -676,6 +692,9 @@ function CanvasInner({ canvasId }: CanvasInnerProps) {
           panOnDrag={flowPanOnDrag}
           selectionOnDrag={flowSelectionOnDrag}
           panActivationKeyCode="Space"
+          connectionRadius={HANDLE_GLOW_RADIUS_PX}
+          reconnectRadius={24}
+          edgesReconnectable
           proOptions={{ hideAttribution: true }}
           colorMode={resolvedTheme === "dark" ? "dark" : "light"}
           className={cn(
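The diff wires `projectCanvasFavoritesVisibility` into the `<ReactFlow>` props, but its implementation lives in `./canvas-favorites-visibility`, which is not part of this diff. A minimal sketch of what such a projection could do, assuming a hypothetical `isFavorite` flag on node data (the name and behavior are assumptions, not the project's actual code):

```typescript
type FlowNode = { id: string; data?: { isFavorite?: boolean } };
type FlowEdge = { id: string; source: string; target: string };

// Sketch: when the filter is active, keep only favorite nodes and the edges
// whose endpoints both stay visible; always report the favorite count.
function projectFavoritesVisibilitySketch(args: {
  nodes: FlowNode[];
  edges: FlowEdge[];
  favoritesOnly: boolean;
}) {
  const { nodes, edges, favoritesOnly } = args;
  const favoriteIds = new Set(
    nodes.filter((node) => node.data?.isFavorite).map((node) => node.id),
  );
  if (!favoritesOnly) {
    return { nodes, edges, favoriteCount: favoriteIds.size };
  }
  return {
    nodes: nodes.filter((node) => favoriteIds.has(node.id)),
    edges: edges.filter(
      (edge) => favoriteIds.has(edge.source) && favoriteIds.has(edge.target),
    ),
    favoriteCount: favoriteIds.size,
  };
}
```

Returning a projection instead of mutating state keeps the canonical `nodes`/`edges` arrays intact, so toggling the toolbar star is purely presentational.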


@@ -7,16 +7,22 @@ import {
   getSmoothStepPath,
   getStraightPath,
   type ConnectionLineComponentProps,
+  useConnection,
   useReactFlow,
 } from "@xyflow/react";
 import { useEffect, useMemo } from "react";
 import {
   HANDLE_SNAP_RADIUS_PX,
+  resolveCanvasGlowStrength,
   resolveCanvasMagnetTarget,
 } from "@/components/canvas/canvas-connection-magnetism";
 import { useCanvasConnectionMagnetism } from "@/components/canvas/canvas-connection-magnetism-context";
-import { connectionLineAccentRgb } from "@/lib/canvas-utils";
+import {
+  connectionLineAccentRgb,
+  connectionLineGlowFilter,
+  type EdgeGlowColorMode,
+} from "@/lib/canvas-utils";
 function hasSameMagnetTarget(
   a: Parameters<ReturnType<typeof useCanvasConnectionMagnetism>["setActiveTarget"]>[0],
@@ -50,31 +56,43 @@ export default function CustomConnectionLine({
   fromPosition,
   toPosition,
   connectionStatus,
+  pointer,
 }: ConnectionLineComponentProps) {
-  const { getNodes, getEdges } = useReactFlow();
+  const { getNodes, getEdges, screenToFlowPosition } = useReactFlow();
+  const connection = useConnection();
   const { activeTarget, setActiveTarget } = useCanvasConnectionMagnetism();
   const fromHandleId = fromHandle?.id;
   const fromNodeId = fromNode?.id;
+  const connectionFromHandleType =
+    connection.fromHandle?.type === "source" || connection.fromHandle?.type === "target"
+      ? connection.fromHandle.type
+      : null;
   const fromHandleType =
     fromHandle?.type === "source" || fromHandle?.type === "target"
       ? fromHandle.type
-      : null;
+      : connectionFromHandleType ?? "source";
   const resolvedMagnetTarget = useMemo(() => {
     if (!fromHandleType || !fromNodeId) {
       return null;
     }
+    const magnetPoint =
+      pointer && Number.isFinite(pointer.x) && Number.isFinite(pointer.y)
+        ? { x: pointer.x, y: pointer.y }
+        : { x: toX, y: toY };
     return resolveCanvasMagnetTarget({
-      point: { x: toX, y: toY },
+      point: magnetPoint,
       fromNodeId,
       fromHandleId: fromHandleId ?? undefined,
       fromHandleType,
       nodes: getNodes(),
       edges: getEdges(),
     });
-  }, [fromHandleId, fromHandleType, fromNodeId, getEdges, getNodes, toX, toY]);
+  }, [fromHandleId, fromHandleType, fromNodeId, getEdges, getNodes, pointer, toX, toY]);
   useEffect(() => {
     if (hasSameMagnetTarget(activeTarget, resolvedMagnetTarget)) {
@@ -84,12 +102,21 @@ export default function CustomConnectionLine({
   }, [activeTarget, resolvedMagnetTarget, setActiveTarget]);
   const magnetTarget = activeTarget ?? resolvedMagnetTarget;
+  const glowStrength = magnetTarget
+    ? resolveCanvasGlowStrength({
+        distancePx: magnetTarget.distancePx,
+      })
+    : 0;
   const snappedTarget =
     magnetTarget && magnetTarget.distancePx <= HANDLE_SNAP_RADIUS_PX
       ? magnetTarget
       : null;
-  const targetX = snappedTarget?.centerX ?? toX;
-  const targetY = snappedTarget?.centerY ?? toY;
+  const snappedFlowPoint =
+    snappedTarget === null
+      ? null
+      : screenToFlowPosition({ x: snappedTarget.centerX, y: snappedTarget.centerY });
+  const targetX = snappedFlowPoint?.x ?? toX;
+  const targetY = snappedFlowPoint?.y ?? toY;
   const pathParams = {
     sourceX: fromX,
@@ -123,10 +150,17 @@ export default function CustomConnectionLine({
   const [r, g, b] = connectionLineAccentRgb(fromNode.type, fromHandleId);
   const opacity = connectionStatus === "invalid" ? 0.45 : 1;
-  const strokeWidth = snappedTarget ? 3.25 : 2.5;
-  const filter = snappedTarget
-    ? `drop-shadow(0 0 3px rgba(${r}, ${g}, ${b}, 0.7)) drop-shadow(0 0 8px rgba(${r}, ${g}, ${b}, 0.48))`
-    : undefined;
+  const colorMode: EdgeGlowColorMode =
+    typeof document !== "undefined" && document.documentElement.classList.contains("dark")
+      ? "dark"
+      : "light";
+  const strokeWidth = 2.5 + glowStrength * 0.75;
+  const filter = connectionLineGlowFilter({
+    nodeType: fromNode.type,
+    handleId: fromHandleId,
+    strength: glowStrength,
+    colorMode,
+  });
   return (
     <path


@@ -5,10 +5,26 @@ import {
   BaseEdge,
   EdgeLabelRenderer,
   getBezierPath,
+  useViewport,
   type EdgeProps,
 } from "@xyflow/react";
 import { Plus } from "lucide-react";
+const MIN_EDGE_INSERT_BUTTON_SCALE = 0.95;
+const MAX_EDGE_INSERT_BUTTON_SCALE = 2.2;
+function getEdgeInsertButtonScale(zoom: number): number {
+  if (!Number.isFinite(zoom) || zoom <= 0) {
+    return 1;
+  }
+  const inverseZoom = 1 / zoom;
+  return Math.min(
+    MAX_EDGE_INSERT_BUTTON_SCALE,
+    Math.max(MIN_EDGE_INSERT_BUTTON_SCALE, inverseZoom),
+  );
+}
 export type DefaultEdgeInsertAnchor = {
   edgeId: string;
   screenX: number;
@@ -41,6 +57,7 @@ export default function DefaultEdge({
 }: DefaultEdgeProps) {
   const [isEdgeHovered, setIsEdgeHovered] = useState(false);
   const [isButtonHovered, setIsButtonHovered] = useState(false);
+  const { zoom } = useViewport();
   const [edgePath, labelX, labelY] = useMemo(
     () =>
@@ -58,6 +75,7 @@ export default function DefaultEdge({
   const resolvedEdgeId = edgeId ?? id;
   const canInsert = Boolean(onInsertClick) && !disabled;
   const isInsertVisible = canInsert && (isMenuOpen || isEdgeHovered || isButtonHovered);
+  const buttonScale = getEdgeInsertButtonScale(zoom);
   const handleInsertClick = (event: MouseEvent<HTMLButtonElement>) => {
     if (!onInsertClick || disabled) {
@@ -97,9 +115,10 @@ export default function DefaultEdge({
           aria-label="Insert node"
           aria-hidden={!isInsertVisible}
           disabled={!canInsert}
-          className="nodrag nopan absolute h-7 w-7 items-center justify-center rounded-full border border-border bg-background text-foreground shadow-sm transition-opacity"
+          className="nodrag nopan absolute h-8 w-8 items-center justify-center rounded-full border-2 border-border/90 bg-background/95 text-foreground shadow-[0_2px_10px_rgba(0,0,0,0.28)] ring-1 ring-background/90 transition-opacity transition-shadow"
           style={{
-            transform: `translate(-50%, -50%) translate(${labelX}px, ${labelY}px)`,
+            transform: `translate(-50%, -50%) translate(${labelX}px, ${labelY}px) scale(${buttonScale})`,
+            transformOrigin: "center center",
             opacity: isInsertVisible ? 1 : 0,
             pointerEvents: isInsertVisible ? "all" : "none",
             display: "flex",
@@ -108,7 +127,7 @@ export default function DefaultEdge({
           onMouseLeave={() => setIsButtonHovered(false)}
           onClick={handleInsertClick}
         >
-          <Plus className="h-4 w-4" aria-hidden="true" />
+          <Plus className="h-[18px] w-[18px] stroke-[2.5]" aria-hidden="true" />
         </button>
       </EdgeLabelRenderer>
     </>


@@ -1,6 +1,6 @@
 "use client";
-import { useCallback, useMemo, useRef, useState } from "react";
+import { useCallback, useEffect, useMemo, useRef, useState } from "react";
 import { Position, type NodeProps } from "@xyflow/react";
 import { ImageIcon } from "lucide-react";
 import BaseNodeWrapper from "./base-node-wrapper";
@@ -36,12 +36,18 @@ type CompareSideState = {
 type CompareDisplayMode = "render" | "preview";
-export default function CompareNode({ id, data, selected, width }: NodeProps) {
+type CompareSurfaceSize = {
+  width: number;
+  height: number;
+};
+export default function CompareNode({ id, data, selected, width, height }: NodeProps) {
   const nodeData = data as CompareNodeData;
   const graph = useCanvasGraph();
   const [sliderX, setSliderX] = useState(50);
   const [manualDisplayMode, setManualDisplayMode] = useState<CompareDisplayMode | null>(null);
   const containerRef = useRef<HTMLDivElement>(null);
+  const [surfaceSize, setSurfaceSize] = useState<CompareSurfaceSize | null>(null);
   const incomingEdges = useMemo(
     () => graph.incomingEdgesByTarget.get(id) ?? [],
     [graph, id],
@@ -74,8 +80,14 @@ export default function CompareNode({ id, data, selected, width }: NodeProps) {
     graph,
   });
-  if (preview.sourceUrl) {
-    previewInput = {
+  if (preview.sourceUrl || preview.sourceComposition) {
+    previewInput = preview.sourceComposition
+      ? {
+          sourceUrl: null,
+          sourceComposition: preview.sourceComposition,
+          steps: preview.steps,
+        }
+      : {
           sourceUrl: preview.sourceUrl,
           steps: preview.steps,
         };
@@ -92,6 +104,7 @@ export default function CompareNode({ id, data, selected, width }: NodeProps) {
       sourceLastUploadedHash ?? sourceLastRenderedHash;
     const sourceCurrentHash = resolveRenderPipelineHash({
       sourceUrl: preview.sourceUrl,
+      sourceComposition: preview.sourceComposition,
       steps: preview.steps,
       data: sourceData,
     });
@@ -173,7 +186,60 @@ export default function CompareNode({ id, data, selected, width }: NodeProps) {
     resolvedSides.right.isStaleRenderOutput;
   const effectiveDisplayMode =
     manualDisplayMode ?? (shouldDefaultToPreview ? "preview" : "render");
-  const previewNodeWidth = Math.max(240, Math.min(640, Math.round(width ?? 500)));
+  const fallbackSurfaceWidth = Math.max(240, Math.min(640, Math.round(width ?? 500)));
+  const fallbackSurfaceHeight = Math.max(180, Math.min(720, Math.round(height ?? 380)));
+  const previewNodeWidth = Math.max(
+    1,
+    Math.round(surfaceSize?.width ?? fallbackSurfaceWidth),
+  );
+  const previewNodeHeight = Math.max(
+    1,
+    Math.round(surfaceSize?.height ?? fallbackSurfaceHeight),
+  );
+  useEffect(() => {
+    const surfaceElement = containerRef.current;
+    if (!surfaceElement) {
+      return;
+    }
+    const updateSurfaceSize = (nextWidth: number, nextHeight: number) => {
+      const roundedWidth = Math.max(1, Math.round(nextWidth));
+      const roundedHeight = Math.max(1, Math.round(nextHeight));
+      setSurfaceSize((current) =>
+        current?.width === roundedWidth && current?.height === roundedHeight
+          ? current
+          : {
+              width: roundedWidth,
+              height: roundedHeight,
+            },
+      );
+    };
+    const measureSurface = () => {
+      const rect = surfaceElement.getBoundingClientRect();
+      updateSurfaceSize(rect.width, rect.height);
+    };
+    measureSurface();
+    if (typeof ResizeObserver === "undefined") {
+      return undefined;
+    }
+    const observer = new ResizeObserver((entries) => {
+      const entry = entries[0];
+      if (!entry) {
+        return;
+      }
+      updateSurfaceSize(entry.contentRect.width, entry.contentRect.height);
+    });
+    observer.observe(surfaceElement);
+    return () => observer.disconnect();
+  }, []);
   const setSliderPercent = useCallback((value: number) => {
     setSliderX(Math.max(0, Math.min(100, value)));
@@ -321,6 +387,7 @@ export default function CompareNode({ id, data, selected, width }: NodeProps) {
               previewInput={resolvedSides.right.previewInput}
               mixerPreviewState={resolvedSides.right.mixerPreviewState}
               nodeWidth={previewNodeWidth}
+              nodeHeight={previewNodeHeight}
               preferPreview={effectiveDisplayMode === "preview"}
             />
           )}
@@ -332,6 +399,7 @@ export default function CompareNode({ id, data, selected, width }: NodeProps) {
             previewInput={resolvedSides.left.previewInput}
             mixerPreviewState={resolvedSides.left.mixerPreviewState}
             nodeWidth={previewNodeWidth}
+            nodeHeight={previewNodeHeight}
             clipWidthPercent={sliderX}
             preferPreview={effectiveDisplayMode === "preview"}
           />


@@ -1,5 +1,7 @@
 "use client";
+import { useState } from "react";
 import { useCanvasGraph } from "@/components/canvas/canvas-graph-context";
 import { usePipelinePreview } from "@/hooks/use-pipeline-preview";
 import {
@@ -7,8 +9,20 @@ import {
   type RenderPreviewInput,
 } from "@/lib/canvas-render-preview";
 import type { MixerPreviewState } from "@/lib/canvas-mixer-preview";
+import {
+  computeMixerCompareOverlayImageStyle,
+  computeMixerFrameRectInSurface,
+  isMixerCropImageReady,
+} from "@/lib/mixer-crop-layout";
 const EMPTY_STEPS: RenderPreviewInput["steps"] = [];
+const ZERO_SIZE = { width: 0, height: 0 };
+type LoadedImageState = {
+  url: string | null;
+  width: number;
+  height: number;
+};
 type CompareSurfaceProps = {
   finalUrl?: string;
@@ -16,6 +30,7 @@ type CompareSurfaceProps = {
   previewInput?: RenderPreviewInput;
   mixerPreviewState?: MixerPreviewState;
   nodeWidth: number;
+  nodeHeight: number;
   clipWidthPercent?: number;
   preferPreview?: boolean;
 };
@@ -26,12 +41,22 @@ export default function CompareSurface({
   previewInput,
   mixerPreviewState,
   nodeWidth,
+  nodeHeight,
   clipWidthPercent,
   preferPreview,
 }: CompareSurfaceProps) {
   const graph = useCanvasGraph();
+  const [baseImageState, setBaseImageState] = useState<LoadedImageState>({
+    url: null,
+    ...ZERO_SIZE,
+  });
+  const [overlayImageState, setOverlayImageState] = useState<LoadedImageState>({
+    url: null,
+    ...ZERO_SIZE,
+  });
   const usePreview = Boolean(previewInput && (preferPreview || !finalUrl));
   const previewSourceUrl = usePreview ? previewInput?.sourceUrl ?? null : null;
+  const previewSourceComposition = usePreview ? previewInput?.sourceComposition : undefined;
   const previewSteps = usePreview ? previewInput?.steps ?? EMPTY_STEPS : EMPTY_STEPS;
   const visibleFinalUrl = usePreview ? undefined : finalUrl;
   const previewDebounceMs = shouldFastPathPreviewPipeline(
@@ -43,6 +68,7 @@ export default function CompareSurface({
   const { canvasRef, isRendering, error } = usePipelinePreview({
     sourceUrl: previewSourceUrl,
+    sourceComposition: previewSourceComposition,
     steps: previewSteps,
     nodeWidth,
     includeHistogram: false,
@@ -64,6 +90,35 @@ export default function CompareSurface({
         }
       : undefined;
+  const baseNaturalSize =
+    mixerPreviewState?.baseUrl && mixerPreviewState.baseUrl === baseImageState.url
+      ? { width: baseImageState.width, height: baseImageState.height }
+      : ZERO_SIZE;
+  const overlayNaturalSize =
+    mixerPreviewState?.overlayUrl && mixerPreviewState.overlayUrl === overlayImageState.url
+      ? { width: overlayImageState.width, height: overlayImageState.height }
+      : ZERO_SIZE;
+  const mixerCropReady = isMixerCropImageReady({
+    currentOverlayUrl: mixerPreviewState?.overlayUrl,
+    loadedOverlayUrl: overlayImageState.url,
+    sourceWidth: overlayNaturalSize.width,
+    sourceHeight: overlayNaturalSize.height,
+  });
+  const mixerFrameRect = hasMixerPreview
+    ? computeMixerFrameRectInSurface({
+        surfaceWidth: nodeWidth,
+        surfaceHeight: nodeHeight,
+        baseWidth: baseNaturalSize.width,
+        baseHeight: baseNaturalSize.height,
+        overlayX: mixerPreviewState.overlayX,
+        overlayY: mixerPreviewState.overlayY,
+        overlayWidth: mixerPreviewState.overlayWidth,
+        overlayHeight: mixerPreviewState.overlayHeight,
+        fit: "contain",
+      })
+    : null;
   return (
     <div className="pointer-events-none absolute inset-0" style={clipStyle}>
       {visibleFinalUrl ? (
@@ -87,19 +142,62 @@ export default function CompareSurface({
             alt={label ?? "Comparison image"}
             className="absolute inset-0 h-full w-full object-contain"
             draggable={false}
+            onLoad={(event) => {
+              setBaseImageState({
+                url: event.currentTarget.currentSrc || event.currentTarget.src,
+                width: event.currentTarget.naturalWidth,
+                height: event.currentTarget.naturalHeight,
+              });
+            }}
           />
+          {mixerFrameRect ? (
+            <div
+              className="absolute overflow-hidden"
+              style={{
+                mixBlendMode: mixerPreviewState.blendMode,
+                opacity: mixerPreviewState.opacity / 100,
+                left: `${mixerFrameRect.x * 100}%`,
+                top: `${mixerFrameRect.y * 100}%`,
+                width: `${mixerFrameRect.width * 100}%`,
+                height: `${mixerFrameRect.height * 100}%`,
+              }}
+            >
               {/* eslint-disable-next-line @next/next/no-img-element */}
               <img
                 src={mixerPreviewState.overlayUrl}
                 alt={label ?? "Comparison image"}
-                className="absolute inset-0 h-full w-full object-contain"
+                className="absolute max-w-none"
                 draggable={false}
-                style={{
-                  mixBlendMode: mixerPreviewState.blendMode,
-                  opacity: mixerPreviewState.opacity / 100,
-                  transform: `translate(${mixerPreviewState.offsetX}px, ${mixerPreviewState.offsetY}px)`,
-                }}
+                onLoad={(event) => {
+                  setOverlayImageState({
+                    url: event.currentTarget.currentSrc || event.currentTarget.src,
+                    width: event.currentTarget.naturalWidth,
+                    height: event.currentTarget.naturalHeight,
+                  });
+                }}
+                style={
+                  mixerCropReady
+                    ? computeMixerCompareOverlayImageStyle({
+                        surfaceWidth: nodeWidth,
+                        surfaceHeight: nodeHeight,
+                        baseWidth: baseNaturalSize.width,
+                        baseHeight: baseNaturalSize.height,
+                        overlayX: mixerPreviewState.overlayX,
+                        overlayY: mixerPreviewState.overlayY,
+                        overlayWidth: mixerPreviewState.overlayWidth,
+                        overlayHeight: mixerPreviewState.overlayHeight,
+                        sourceWidth: overlayNaturalSize.width,
+                        sourceHeight: overlayNaturalSize.height,
+                        cropLeft: mixerPreviewState.cropLeft,
+                        cropTop: mixerPreviewState.cropTop,
+                        cropRight: mixerPreviewState.cropRight,
+                        cropBottom: mixerPreviewState.cropBottom,
+                      })
+                    : { visibility: "hidden" }
+                }
               />
+            </div>
+          ) : null}
         </>
       ) : null}
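`computeMixerFrameRectInSurface` comes from `@/lib/mixer-crop-layout`, whose implementation is not shown here. A plausible sketch of the mapping it has to perform — a normalized overlay frame (0..1, relative to the base image, as described for the mixer node) expressed as fractions of the surface, assuming a centered contain-fit with letterboxing (function name, argument shape, and the exact formula are assumptions):

```typescript
type FrameRect = { x: number; y: number; width: number; height: number };

// Contain-fit the base image into the surface, then express the normalized
// overlay frame (relative to the fitted base image) as surface fractions.
function frameRectInSurfaceSketch(args: {
  surfaceWidth: number;
  surfaceHeight: number;
  baseWidth: number;
  baseHeight: number;
  overlayX: number;
  overlayY: number;
  overlayWidth: number;
  overlayHeight: number;
}): FrameRect | null {
  const { surfaceWidth, surfaceHeight, baseWidth, baseHeight } = args;
  if (surfaceWidth <= 0 || surfaceHeight <= 0 || baseWidth <= 0 || baseHeight <= 0) {
    return null; // nothing measurable yet (e.g. images still loading)
  }
  const scale = Math.min(surfaceWidth / baseWidth, surfaceHeight / baseHeight);
  const fittedWidth = baseWidth * scale;
  const fittedHeight = baseHeight * scale;
  const offsetX = (surfaceWidth - fittedWidth) / 2; // centered letterbox margin
  const offsetY = (surfaceHeight - fittedHeight) / 2;
  return {
    x: (offsetX + args.overlayX * fittedWidth) / surfaceWidth,
    y: (offsetY + args.overlayY * fittedHeight) / surfaceHeight,
    width: (args.overlayWidth * fittedWidth) / surfaceWidth,
    height: (args.overlayHeight * fittedHeight) / surfaceHeight,
  };
}
```

Returning percentages of the surface is what lets the JSX above position the overlay container with `left`/`top`/`width`/`height` in `%`, so the frame stays anchored to the base image as the node is resized.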

File diff suppressed because it is too large Load Diff


@@ -464,11 +464,13 @@ export default function RenderNode({ id, data, selected, width, height }: NodePr
   );
   const sourceUrl = renderPreviewInput.sourceUrl;
+  const sourceComposition = renderPreviewInput.sourceComposition;
   useEffect(() => {
     logRenderDebug("node-data-updated", {
       nodeId: id,
       hasSourceUrl: typeof sourceUrl === "string" && sourceUrl.length > 0,
+      hasSourceComposition: Boolean(sourceComposition),
       storageId: data.storageId ?? null,
       lastUploadStorageId: data.lastUploadStorageId ?? null,
       hasResolvedUrl: typeof data.url === "string" && data.url.length > 0,
@@ -485,6 +487,7 @@ export default function RenderNode({ id, data, selected, width, height }: NodePr
     data.url,
     id,
     sourceUrl,
+    sourceComposition,
   ]);
   const sourceNode = useMemo<SourceNodeDescriptor | null>(
@@ -526,9 +529,12 @@ export default function RenderNode({ id, data, selected, width, height }: NodePr
   );
   const currentPipelineHash = useMemo(() => {
-    if (!sourceUrl) return null;
-    return hashPipeline({ sourceUrl, render: renderFingerprint }, steps);
-  }, [renderFingerprint, sourceUrl, steps]);
+    if (!sourceUrl && !sourceComposition) return null;
+    return hashPipeline(
+      { source: sourceComposition ?? sourceUrl, render: renderFingerprint },
+      steps,
+    );
+  }, [renderFingerprint, sourceComposition, sourceUrl, steps]);
   const isRenderCurrent =
     Boolean(currentPipelineHash) && localData.lastRenderedHash === currentPipelineHash;
@@ -558,7 +564,8 @@ export default function RenderNode({ id, data, selected, width, height }: NodePr
     error: "Error",
   };
-  const hasSource = typeof sourceUrl === "string" && sourceUrl.length > 0;
+  const hasSource =
+    (typeof sourceUrl === "string" && sourceUrl.length > 0) || Boolean(sourceComposition);
   const previewNodeWidth = Math.max(260, Math.round(width ?? 320));
   const {
@@ -569,6 +576,7 @@ export default function RenderNode({ id, data, selected, width, height }: NodePr
     error: previewError,
   } = usePipelinePreview({
     sourceUrl,
+    sourceComposition,
     steps,
     nodeWidth: previewNodeWidth,
     debounceMs: previewDebounceMs,
@@ -586,6 +594,7 @@ export default function RenderNode({ id, data, selected, width, height }: NodePr
     error: fullscreenPreviewError,
   } = usePipelinePreview({
     sourceUrl: isFullscreenOpen && sourceUrl ? sourceUrl : null,
+    sourceComposition: isFullscreenOpen ? sourceComposition : undefined,
     steps,
     nodeWidth: fullscreenPreviewWidth,
     includeHistogram: false,
@@ -720,11 +729,12 @@ export default function RenderNode({ id, data, selected, width, height }: NodePr
   };
   const handleRender = async (mode: "download" | "upload") => {
-    if (!sourceUrl || !currentPipelineHash) {
+    if ((!sourceUrl && !sourceComposition) || !currentPipelineHash) {
       logRenderDebug("render-aborted-prerequisites", {
         nodeId: id,
         mode,
         hasSourceUrl: Boolean(sourceUrl),
+        hasSourceComposition: Boolean(sourceComposition),
         hasPipelineHash: Boolean(currentPipelineHash),
         isOffline: status.isOffline,
       });
@@ -769,7 +779,8 @@ export default function RenderNode({ id, data, selected, width, height }: NodePr
     });
     const renderResult = await renderFullWithWorkerFallback({
-      sourceUrl,
+      sourceUrl: sourceUrl ?? undefined,
+      sourceComposition,
       steps,
       render: {
         resolution: activeData.outputResolution,

View File

@@ -22,6 +22,25 @@ function logNodeDataDebug(event: string, payload: Record<string, unknown>): void
   console.info("[Canvas node debug]", event, payload);
 }
+function diffNodeData(
+  before: Record<string, unknown>,
+  after: Record<string, unknown>,
+): Record<string, { before: unknown; after: unknown }> {
+  const keys = new Set([...Object.keys(before), ...Object.keys(after)]);
+  const diff: Record<string, { before: unknown; after: unknown }> = {};
+  for (const key of keys) {
+    if (before[key] !== after[key]) {
+      diff[key] = {
+        before: before[key],
+        after: after[key],
+      };
+    }
+  }
+  return diff;
+}
 export function useNodeLocalData<T>({
   nodeId,
   data,
@@ -55,6 +74,16 @@ export function useNodeLocalData<T>({
   const savedValue = localDataRef.current;
   const savedVersion = localChangeVersionRef.current;
+  logNodeDataDebug("queue-save-flush", {
+    nodeId,
+    nodeType: debugLabel,
+    savedVersion,
+    changedFields: diffNodeData(
+      acceptedPersistedDataRef.current as Record<string, unknown>,
+      savedValue as Record<string, unknown>,
+    ),
+  });
   Promise.resolve(onSave(savedValue))
     .then(() => {
       if (!isMountedRef.current || savedVersion !== localChangeVersionRef.current) {
@@ -144,7 +173,17 @@ export function useNodeLocalData<T>({
   const updateLocalData = useCallback(
     (updater: (current: T) => T) => {
-      const next = updater(localDataRef.current);
+      const previous = localDataRef.current;
+      const next = updater(previous);
+      logNodeDataDebug("local-update", {
+        nodeId,
+        nodeType: debugLabel,
+        changedFields: diffNodeData(
+          previous as Record<string, unknown>,
+          next as Record<string, unknown>,
+        ),
+      });
       localChangeVersionRef.current += 1;
       hasPendingLocalChangesRef.current = true;
@@ -153,7 +192,7 @@ export function useNodeLocalData<T>({
       setPreviewNodeDataOverride(nodeId, next);
       queueSave();
     },
-    [nodeId, queueSave, setPreviewNodeDataOverride],
+    [debugLabel, nodeId, queueSave, setPreviewNodeDataOverride],
   );
   return {

View File

@@ -414,7 +414,7 @@ export function useCanvasConnections({
     !isOptimisticEdgeId(edge.id),
   );
   const incomingEdge = incomingEdges.length === 1 ? incomingEdges[0] : undefined;
-  const splitValidationError =
+  const shouldAttemptAutoSplit =
     validationError === "adjustment-incoming-limit" &&
     droppedConnection.sourceNodeId === fromNode.id &&
     fromHandle.type === "source" &&
@@ -424,7 +424,8 @@ export function useCanvasConnections({
     hasHandleKey(splitHandles, "target") &&
     incomingEdge !== undefined &&
     incomingEdge.source !== fullFromNode.id &&
-    incomingEdge.target !== fullFromNode.id
+    incomingEdge.target !== fullFromNode.id;
+  const splitValidationError = shouldAttemptAutoSplit
     ? validateCanvasEdgeSplit({
         nodes: nodesRef.current,
         edges: edgesRef.current,
@@ -433,7 +434,21 @@ export function useCanvasConnections({
       })
     : null;
-  if (!splitValidationError && incomingEdge && fullFromNode && splitHandles) {
+  logCanvasConnectionDebug("connect:end-auto-split-eval", {
+    point: pt,
+    flow,
+    droppedConnection,
+    validationError,
+    shouldAttemptAutoSplit,
+    splitValidationError,
+    fromNodeId: fromNode.id,
+    fromNodeType: fullFromNode?.type ?? null,
+    incomingEdgeId: incomingEdge?.id ?? null,
+    incomingEdgeSourceNodeId: incomingEdge?.source ?? null,
+    incomingEdgeTargetNodeId: incomingEdge?.target ?? null,
+  });
+  if (shouldAttemptAutoSplit && !splitValidationError && incomingEdge && fullFromNode && splitHandles) {
     logCanvasConnectionDebug("connect:end-auto-split", {
       point: pt,
       flow,

View File

@@ -58,7 +58,7 @@ Alle Node-Typen werden über Validators definiert: `phase1NodeTypeValidator`, `n
 | `video-prompt` | `content`, `modelId`, `durationSeconds` | KI-Video-Steuer-Node (Eingabe) |
 | `ai-video` | `storageId`, `prompt`, `model`, `modelLabel`, `durationSeconds`, `creditCost`, `generatedAt`, `taskId` (transient) | Generiertes KI-Video (System-Output) |
 | `compare` | `leftNodeId`, `rightNodeId`, `sliderPosition` | Vergleichs-Node |
-| `mixer` | `blendMode`, `opacity`, `offsetX`, `offsetY` | V1 Merge-Control-Node mit pseudo-image Output (kein Storage-Write) |
+| `mixer` | `blendMode`, `opacity`, `overlayX`, `overlayY`, `overlayWidth`, `overlayHeight` | V1 Merge-Control-Node mit pseudo-image Output (kein Storage-Write) |
 | `frame` | `label`, `exportWidth`, `exportHeight`, `backgroundColor` | Artboard |
 | `group` | `label`, `collapsed` | Container-Node |
 | `note` | `content`, `color` | Anmerkung |
@@ -338,6 +338,8 @@ Wirft bei unauthentifiziertem Zugriff. Wird von allen Queries und Mutations genu
 - `mixer` ist ein Control-Node mit pseudo-image Semantik, nicht mit persistiertem Medien-Output.
 - Keine zusaetzlichen Convex-Tabellen oder Storage-Flows fuer Mixer-Vorschauen.
 - Validierung laeuft client- und serverseitig ueber dieselbe Policy (`validateCanvasConnectionPolicy`); `edges.ts` delegiert darauf fuer Paritaet.
+- Offizieller Bake-Pfad fuer Mixer ist `mixer -> render` (Render verarbeitet die Mixer-Komposition in Preview/Render-Pipeline).
+- `mixer -> adjustments -> render` ist derzeit bewusst deferred und nicht Teil des offiziell supporteten Flows.
--- ---

View File

@@ -0,0 +1,92 @@
# Mixer Resize/Crop Design
**Goal:** Make mixer overlay resize behave like proportional image scaling, and make crop behave like classic edge-based trimming without changing displayed image size.
## Approved Interaction Model
- `Resize` changes the displayed overlay size only.
- `Resize` keeps aspect ratio locked.
- `Crop` changes only the visible source region.
- `Crop` does not change the displayed overlay frame size.
- `Crop` uses 8 handles: 4 corners and 4 side-midpoints.
- Dragging inside the crop box repositions the crop region.
## Conceptual Split
### 1. Display Frame
Controls where and how large the overlay appears in the mixer.
- `overlayX`
- `overlayY`
- `overlayWidth`
- `overlayHeight`
These fields represent the displayed overlay frame in mixer preview/output space.
### 2. Source Crop Region
Controls which part of the source image is shown inside that frame.
Recommended crop contract:
- `cropLeft`
- `cropTop`
- `cropRight`
- `cropBottom`
All values are normalized source-image trims from the corresponding edge.
Why this model:
- Left handle changes only `cropLeft`
- Top handle changes only `cropTop`
- Corner handles combine two crop edges
- The mental model exactly matches "take content away from edges"
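A minimal sketch of this handle-to-field mapping, assuming the crop contract above (`MIN_CROP_SIZE` is an illustrative minimum-remaining-region constant, not a confirmed project value):

```typescript
type CropTrims = { cropLeft: number; cropTop: number; cropRight: number; cropBottom: number };

const clamp = (v: number, min: number, max: number) => Math.max(min, Math.min(max, v));
const MIN_CROP_SIZE = 0.1; // assumed minimum remaining crop region

// A left edge handle touches only cropLeft; clamping prevents an inverted
// or sub-minimum crop region.
function trimLeft(trims: CropTrims, delta: number): CropTrims {
  return {
    ...trims,
    cropLeft: clamp(trims.cropLeft + delta, 0, 1 - trims.cropRight - MIN_CROP_SIZE),
  };
}

// A top-left corner handle combines the two edge trims.
function trimTopLeft(trims: CropTrims, dx: number, dy: number): CropTrims {
  const next = trimLeft(trims, dx);
  return {
    ...next,
    cropTop: clamp(next.cropTop + dy, 0, 1 - next.cropBottom - MIN_CROP_SIZE),
  };
}
```

Each handle mutates only its own edge fields, which is exactly the "take content away from edges" mental model.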
## Rendering Semantics
Preview, compare, and bake must all use the same mapping:
1. Resolve source image.
2. Apply crop trims to derive the sampled source rect.
3. Draw that sampled rect into the displayed overlay frame.
This removes the ambiguous zoom-like behavior from crop mode.
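The three steps above can be sketched as pure geometry helpers. This is a sketch only; field names follow this document, and the resulting rects are assumed to feed a canvas-2D call like `ctx.drawImage(img, sx, sy, sw, sh, dx, dy, dw, dh)`:

```typescript
type MixerGeometry = {
  overlayX: number; overlayY: number; overlayWidth: number; overlayHeight: number;
  cropLeft: number; cropTop: number; cropRight: number; cropBottom: number;
};

type Rect = { x: number; y: number; width: number; height: number };

// Step 2: apply crop trims to derive the sampled source rect in source pixels.
function sampledSourceRect(geo: MixerGeometry, srcWidth: number, srcHeight: number): Rect {
  return {
    x: geo.cropLeft * srcWidth,
    y: geo.cropTop * srcHeight,
    width: (1 - geo.cropLeft - geo.cropRight) * srcWidth,
    height: (1 - geo.cropTop - geo.cropBottom) * srcHeight,
  };
}

// Step 3: the destination frame in output pixels; crop never changes this rect.
function displayFrameRect(geo: MixerGeometry, outWidth: number, outHeight: number): Rect {
  return {
    x: geo.overlayX * outWidth,
    y: geo.overlayY * outHeight,
    width: geo.overlayWidth * outWidth,
    height: geo.overlayHeight * outHeight,
  };
}
```

Because preview, compare, and bake share this mapping, trimming an edge shrinks the sampled source rect while the destination frame stays fixed, so the remaining content simply fills the unchanged frame.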
## UX Rules
### Resize Mode
- Handles are anchored to the display frame.
- Corner drag scales proportionally.
- Side handles are either hidden or mapped to proportional scaling from the nearest axis while preserving aspect ratio.
- Resize never mutates crop fields.
### Crop Mode
- Handles are anchored to the crop box.
- Edge handles trim one side.
- Corner handles trim two sides.
- Drag inside crop box repositions the crop window.
- Crop never mutates display frame size.
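Repositioning the crop window (the last two rules) can be sketched as shifting the trims while holding the visible region size constant, assuming the normalized crop contract from this document:

```typescript
type CropTrims = { cropLeft: number; cropTop: number; cropRight: number; cropBottom: number };

const clamp = (v: number, min: number, max: number) => Math.max(min, Math.min(max, v));

// Dragging inside the crop box shifts the window without resizing it:
// the visible width (1 - left - right) and height (1 - top - bottom) stay constant.
function repositionCrop(trims: CropTrims, dx: number, dy: number): CropTrims {
  const visibleWidth = 1 - trims.cropLeft - trims.cropRight;
  const visibleHeight = 1 - trims.cropTop - trims.cropBottom;
  const cropLeft = clamp(trims.cropLeft + dx, 0, 1 - visibleWidth);
  const cropTop = clamp(trims.cropTop + dy, 0, 1 - visibleHeight);
  return {
    cropLeft,
    cropTop,
    cropRight: 1 - visibleWidth - cropLeft,
    cropBottom: 1 - visibleHeight - cropTop,
  };
}
```

Clamping against `1 - visibleWidth` keeps the crop box inside source bounds, so a large drag pins the window against the edge instead of shrinking it.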
## Constraints
- Minimum display size must keep handles usable.
- Minimum crop region must prevent inverted or zero-area crop boxes.
- Crop box stays within source bounds.
- Display frame stays within mixer preview bounds.
## Backward Compatibility
- Existing mixer nodes with `contentX/Y/Width/Height` need a migration/default path.
- If a direct field migration is too risky, normalization can temporarily map legacy content fields into equivalent crop trims.
## Non-Goals
- rotation
- masks
- free distortion
- multi-layer cropping
- standalone crop modal

View File

@@ -0,0 +1,198 @@
# Mixer Resize/Crop Implementation Plan
> **For Claude:** REQUIRED SUB-SKILL: Use superpowers:executing-plans to implement this plan task-by-task.
**Goal:** Correct mixer interactions so resize scales the overlay proportionally while crop trims visible content from any side without changing displayed image size.
**Architecture:** Split displayed overlay geometry from source crop geometry. Keep `overlayX/Y/Width/Height` for display frame placement and size. Introduce explicit crop-edge semantics so preview, compare, and bake all trim the same source region and map it into the unchanged frame.
**Tech Stack:** Next.js 16, React 19, `@xyflow/react`, Vitest, local node preview state via `useNodeLocalData`, image pipeline source-loader.
---
### Task 1: Add failing tests for the approved resize/crop semantics
**Files:**
- Modify: `components/canvas/__tests__/mixer-node.test.tsx`
- Modify: `tests/image-pipeline/source-loader.test.ts`
- Modify: `tests/lib/canvas-mixer-preview.test.ts`
**Step 1: Write the failing tests**
Add tests that prove:
- frame resize keeps aspect ratio locked
- crop handle drag trims edges without changing displayed overlay frame size
- crop drag inside crop box repositions crop region only
- resize does not mutate crop fields
- crop does not mutate `overlayWidth` / `overlayHeight`
**Step 2: Run tests to verify RED**
Run:
```bash
pnpm exec vitest run components/canvas/__tests__/mixer-node.test.tsx tests/image-pipeline/source-loader.test.ts tests/lib/canvas-mixer-preview.test.ts
```
Expected: failures showing that crop currently behaves like zoom/scale instead of edge trimming.
**Step 3: Commit**
```bash
git add components/canvas/__tests__/mixer-node.test.tsx tests/image-pipeline/source-loader.test.ts tests/lib/canvas-mixer-preview.test.ts
git commit -m "test(canvas): cover mixer resize and crop semantics"
```
---
### Task 2: Replace zoom-like content fields with crop-edge normalization
**Files:**
- Modify: `lib/canvas-mixer-preview.ts`
- Modify: `lib/canvas-utils.ts`
- Modify: `lib/canvas-node-templates.ts`
- Modify: `components/canvas/nodes/mixer-node.tsx`
**Step 1: Implement minimal normalized crop model**
Prefer explicit crop trims:
```ts
type MixerCropData = {
cropLeft: number;
cropTop: number;
cropRight: number;
cropBottom: number;
};
```
Normalization rules:
- clamp each crop edge to `0..1`
- enforce minimum remaining source width/height
- preserve display frame fields separately
- map legacy `contentX/Y/Width/Height` into equivalent crop trims during normalization if needed
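The legacy mapping rule can be sketched as follows (field names as proposed in this plan; the real normalizer may clamp differently):

```typescript
type LegacyContent = { contentX: number; contentY: number; contentWidth: number; contentHeight: number };

const clamp01 = (v: number) => Math.max(0, Math.min(1, v));
const MIN_REMAINING = 0.1; // assumed minimum remaining source width/height

// A normalized content rect (x, y, width, height in 0..1) is equivalent to
// four edge trims: left/top are the rect origin, right/bottom are what the
// rect leaves uncovered on the far edges.
function cropFromLegacyContent(legacy: LegacyContent) {
  const x = clamp01(legacy.contentX);
  const y = clamp01(legacy.contentY);
  const width = Math.max(MIN_REMAINING, Math.min(legacy.contentWidth, 1 - x));
  const height = Math.max(MIN_REMAINING, Math.min(legacy.contentHeight, 1 - y));
  return {
    cropLeft: x,
    cropTop: y,
    cropRight: 1 - (x + width),
    cropBottom: 1 - (y + height),
  };
}
```

A full-coverage content rect maps to all-zero trims, so untouched legacy nodes keep showing the whole source.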
**Step 2: Run focused tests**
Run:
```bash
pnpm exec vitest run tests/lib/canvas-mixer-preview.test.ts
```
Expected: GREEN for normalization and backward-compatibility cases.
**Step 3: Commit**
```bash
git add lib/canvas-mixer-preview.ts lib/canvas-utils.ts lib/canvas-node-templates.ts components/canvas/nodes/mixer-node.tsx tests/lib/canvas-mixer-preview.test.ts
git commit -m "feat(canvas): add explicit mixer crop edge model"
```
---
### Task 3: Fix mixer node interactions
**Files:**
- Modify: `components/canvas/nodes/mixer-node.tsx`
- Modify: `components/canvas/__tests__/mixer-node.test.tsx`
**Step 1: Implement proportional resize**
- use display frame aspect ratio as the locked ratio
- corner drag scales frame proportionally
- side handles either hide in resize mode or preserve ratio while scaling
- resize mutates only `overlay*`
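A sketch of the aspect-locked corner drag (bottom-right handle shown; normalized `overlay*` fields, `MIN_SIZE` is an illustrative constant):

```typescript
type Frame = { overlayX: number; overlayY: number; overlayWidth: number; overlayHeight: number };

const MIN_SIZE = 0.1; // assumed minimum frame size

// The frame's current ratio is the locked ratio; width and height scale
// together, bounded so the frame stays inside the 0..1 preview space.
function resizeFromBottomRight(frame: Frame, targetWidth: number): Frame {
  const ratio = frame.overlayHeight / frame.overlayWidth;
  const maxWidth = Math.min(1 - frame.overlayX, (1 - frame.overlayY) / ratio);
  const overlayWidth = Math.max(MIN_SIZE, Math.min(targetWidth, maxWidth));
  return {
    ...frame,
    overlayWidth,
    overlayHeight: overlayWidth * ratio, // ratio stays locked; crop fields untouched
  };
}
```

Deriving height from the locked ratio (instead of tracking both axes) is what guarantees resize can never distort the overlay or leak into crop fields.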
**Step 2: Implement classic crop handles**
- render 8 crop handles in crop mode
- edge handles trim one side
- corner handles trim two sides
- dragging inside crop box repositions crop region
- crop mutates only crop fields
**Step 3: Run focused tests**
Run:
```bash
pnpm exec vitest run components/canvas/__tests__/mixer-node.test.tsx
```
Expected: GREEN for resize and crop semantics.
**Step 4: Commit**
```bash
git add components/canvas/nodes/mixer-node.tsx components/canvas/__tests__/mixer-node.test.tsx
git commit -m "feat(canvas): separate mixer resize and crop interactions"
```
---
### Task 4: Align compare and bake semantics
**Files:**
- Modify: `components/canvas/nodes/compare-surface.tsx`
- Modify: `lib/image-pipeline/source-loader.ts`
- Modify: `tests/image-pipeline/source-loader.test.ts`
- Modify: `tests/lib/canvas-render-preview.test.ts`
- Optional modify: `components/canvas/__tests__/compare-node.test.tsx`
**Step 1: Implement crop-edge sampling everywhere**
- compare preview uses crop edges, not zoom-like content scaling
- bake path samples cropped source region into overlay frame
- non-mixer behavior stays unchanged
**Step 2: Run focused tests**
Run:
```bash
pnpm exec vitest run tests/image-pipeline/source-loader.test.ts tests/lib/canvas-render-preview.test.ts components/canvas/__tests__/compare-node.test.tsx
```
Expected: GREEN with preview/bake parity.
**Step 3: Commit**
```bash
git add components/canvas/nodes/compare-surface.tsx lib/image-pipeline/source-loader.ts tests/image-pipeline/source-loader.test.ts tests/lib/canvas-render-preview.test.ts components/canvas/__tests__/compare-node.test.tsx
git commit -m "fix(canvas): align mixer crop semantics across preview and bake"
```
---
### Task 5: Final verification
**Files:**
- Modify only if docs or small follow-up fixes are needed
**Step 1: Run the verification suite**
```bash
pnpm exec vitest run components/canvas/__tests__/mixer-node.test.tsx tests/lib/canvas-mixer-preview.test.ts tests/lib/canvas-render-preview.test.ts tests/image-pipeline/source-loader.test.ts components/canvas/__tests__/compare-node.test.tsx
pnpm lint
pnpm build
```
Expected: all green.
**Step 2: Commit docs/follow-ups if needed**
```bash
git add components/canvas/CLAUDE.md convex/CLAUDE.md
git commit -m "docs(canvas): document mixer resize and crop semantics"
```
---
## Notes
- Keep `nodrag` and `nopan` on every interactive surface and handle.
- Prefer the smallest migration path from legacy `content*` fields into crop trims.
- Do not broaden into rotation, masks, or non-uniform scaling.

View File

@@ -4,6 +4,7 @@ import { useEffect, useMemo, useRef, useState } from "react";
 import { hashPipeline, type PipelineStep } from "@/lib/image-pipeline/contracts";
 import { emptyHistogram, type HistogramData } from "@/lib/image-pipeline/histogram";
+import type { RenderSourceComposition } from "@/lib/image-pipeline/render-types";
 import {
   isPipelineAbortError,
   renderPreviewWithWorkerFallback,
@@ -12,6 +13,7 @@ import {
 type UsePipelinePreviewOptions = {
   sourceUrl: string | null;
+  sourceComposition?: RenderSourceComposition;
   steps: readonly PipelineStep[];
   nodeWidth: number;
   includeHistogram?: boolean;
@@ -54,6 +56,7 @@ export function usePipelinePreview(options: UsePipelinePreviewOptions): {
   const stableRenderInputRef = useRef<{
     pipelineHash: string;
     sourceUrl: string | null;
+    sourceComposition?: RenderSourceComposition;
     steps: readonly PipelineStep[];
   } | null>(null);
@@ -95,11 +98,11 @@ export function usePipelinePreview(options: UsePipelinePreviewOptions): {
 );
 const pipelineHash = useMemo(() => {
-  if (!options.sourceUrl) {
+  if (!options.sourceUrl && !options.sourceComposition) {
     return "no-source";
   }
-  return hashPipeline(options.sourceUrl, options.steps);
-}, [options.sourceUrl, options.steps]);
+  return hashPipeline(options.sourceComposition ?? options.sourceUrl, options.steps);
+}, [options.sourceComposition, options.sourceUrl, options.steps]);
 useEffect(() => {
   if (stableRenderInputRef.current?.pipelineHash === pipelineHash) {
@@ -109,13 +112,15 @@ export function usePipelinePreview(options: UsePipelinePreviewOptions): {
   stableRenderInputRef.current = {
     pipelineHash,
     sourceUrl: options.sourceUrl,
+    sourceComposition: options.sourceComposition,
     steps: options.steps,
   };
-}, [pipelineHash, options.sourceUrl, options.steps]);
+}, [pipelineHash, options.sourceComposition, options.sourceUrl, options.steps]);
 useEffect(() => {
   const sourceUrl = stableRenderInputRef.current?.sourceUrl ?? null;
-  if (!sourceUrl) {
+  const sourceComposition = stableRenderInputRef.current?.sourceComposition;
+  if (!sourceUrl && !sourceComposition) {
     const frameId = window.requestAnimationFrame(() => {
       setHistogram(emptyHistogram());
       setError(null);
@@ -133,8 +138,10 @@ export function usePipelinePreview(options: UsePipelinePreviewOptions): {
   const timer = window.setTimeout(() => {
     setIsRendering(true);
     setError(null);
+    const resolvedSourceUrl = sourceUrl ?? undefined;
     void renderPreviewWithWorkerFallback({
-      sourceUrl,
+      sourceUrl: resolvedSourceUrl,
+      sourceComposition,
       steps: stableRenderInputRef.current?.steps ?? [],
       previewWidth,
       includeHistogram: options.includeHistogram,
@@ -168,7 +175,8 @@ export function usePipelinePreview(options: UsePipelinePreviewOptions): {
     if (process.env.NODE_ENV !== "production") {
       console.error("[usePipelinePreview] render failed", {
         message,
-        sourceUrl,
+        sourceUrl: resolvedSourceUrl,
+        sourceComposition,
         pipelineHash,
         previewWidth,
         includeHistogram: options.includeHistogram,
@@ -194,7 +202,7 @@ export function usePipelinePreview(options: UsePipelinePreviewOptions): {
   canvasRef,
   histogram,
   isRendering,
-  hasSource: Boolean(options.sourceUrl),
+  hasSource: Boolean(options.sourceUrl || options.sourceComposition),
   previewAspectRatio,
   error,
 };

View File

@@ -30,6 +30,7 @@ const RENDER_ALLOWED_SOURCE_TYPES = new Set<string>([
   "image",
   "asset",
   "ai-image",
+  "mixer",
   "crop",
   "curves",
   "color-adjust",
@@ -209,7 +210,7 @@ export function getCanvasConnectionValidationMessage(
   case "adjustment-target-forbidden":
     return "Adjustment-Ausgaben koennen nicht an Prompt- oder KI-Bild-Nodes angeschlossen werden.";
   case "render-source-invalid":
-    return "Render akzeptiert nur Bild-, Asset-, KI-Bild-, Crop- oder Adjustment-Input.";
+    return "Render akzeptiert nur Bild-, Asset-, KI-Bild-, Mixer-, Crop- oder Adjustment-Input.";
   case "agent-source-invalid":
     return "Agent-Nodes akzeptieren nur Content- und Kontext-Inputs, keine Generierungs-Steuerknoten wie Prompt.";
   case "agent-output-source-invalid":

View File

@@ -19,8 +19,14 @@ export type MixerPreviewState = {
   overlayUrl?: string;
   blendMode: MixerBlendMode;
   opacity: number;
-  offsetX: number;
-  offsetY: number;
+  overlayX: number;
+  overlayY: number;
+  overlayWidth: number;
+  overlayHeight: number;
+  cropLeft: number;
+  cropTop: number;
+  cropRight: number;
+  cropBottom: number;
   error?: MixerPreviewError;
 };
@@ -35,9 +41,18 @@ const DEFAULT_BLEND_MODE: MixerBlendMode = "normal";
 const DEFAULT_OPACITY = 100;
 const MIN_OPACITY = 0;
 const MAX_OPACITY = 100;
-const DEFAULT_OFFSET = 0;
-const MIN_OFFSET = -2048;
-const MAX_OFFSET = 2048;
+const DEFAULT_OVERLAY_X = 0;
+const DEFAULT_OVERLAY_Y = 0;
+const DEFAULT_OVERLAY_WIDTH = 1;
+const DEFAULT_OVERLAY_HEIGHT = 1;
+const DEFAULT_CROP_LEFT = 0;
+const DEFAULT_CROP_TOP = 0;
+const DEFAULT_CROP_RIGHT = 0;
+const DEFAULT_CROP_BOTTOM = 0;
+const MIN_OVERLAY_POSITION = 0;
+const MAX_OVERLAY_POSITION = 1;
+const MIN_OVERLAY_SIZE = 0.1;
+const MAX_OVERLAY_SIZE = 1;
 function clamp(value: number, min: number, max: number): number {
   return Math.max(min, Math.min(max, value));
@@ -65,18 +80,165 @@ function normalizeOpacity(value: unknown): number {
   return clamp(parsed, MIN_OPACITY, MAX_OPACITY);
 }
-function normalizeOffset(value: unknown): number {
+function normalizeOverlayNumber(value: unknown, fallback: number): number {
   const parsed = parseNumeric(value);
   if (parsed === null) {
-    return DEFAULT_OFFSET;
+    return fallback;
   }
-  return clamp(parsed, MIN_OFFSET, MAX_OFFSET);
+  return parsed;
 }
function normalizeUnitRect(args: {
x: unknown;
y: unknown;
width: unknown;
height: unknown;
defaults: { x: number; y: number; width: number; height: number };
}): { x: number; y: number; width: number; height: number } {
const x = clamp(
normalizeOverlayNumber(args.x, args.defaults.x),
MIN_OVERLAY_POSITION,
MAX_OVERLAY_POSITION - MIN_OVERLAY_SIZE,
);
const y = clamp(
normalizeOverlayNumber(args.y, args.defaults.y),
MIN_OVERLAY_POSITION,
MAX_OVERLAY_POSITION - MIN_OVERLAY_SIZE,
);
const width = clamp(
normalizeOverlayNumber(args.width, args.defaults.width),
MIN_OVERLAY_SIZE,
Math.min(MAX_OVERLAY_SIZE, MAX_OVERLAY_POSITION - x),
);
const height = clamp(
normalizeOverlayNumber(args.height, args.defaults.height),
MIN_OVERLAY_SIZE,
Math.min(MAX_OVERLAY_SIZE, MAX_OVERLAY_POSITION - y),
);
return { x, y, width, height };
}
function normalizeOverlayRect(record: Record<string, unknown>): Pick<
MixerPreviewState,
"overlayX" | "overlayY" | "overlayWidth" | "overlayHeight"
> {
const hasLegacyOffset = record.offsetX !== undefined || record.offsetY !== undefined;
const hasOverlayRectField =
record.overlayX !== undefined ||
record.overlayY !== undefined ||
record.overlayWidth !== undefined ||
record.overlayHeight !== undefined;
if (hasLegacyOffset && !hasOverlayRectField) {
return {
overlayX: DEFAULT_OVERLAY_X,
overlayY: DEFAULT_OVERLAY_Y,
overlayWidth: DEFAULT_OVERLAY_WIDTH,
overlayHeight: DEFAULT_OVERLAY_HEIGHT,
};
}
const normalized = normalizeUnitRect({
x: record.overlayX,
y: record.overlayY,
width: record.overlayWidth,
height: record.overlayHeight,
defaults: {
x: DEFAULT_OVERLAY_X,
y: DEFAULT_OVERLAY_Y,
width: DEFAULT_OVERLAY_WIDTH,
height: DEFAULT_OVERLAY_HEIGHT,
},
});
return {
overlayX: normalized.x,
overlayY: normalized.y,
overlayWidth: normalized.width,
overlayHeight: normalized.height,
};
}
function normalizeCropEdges(record: Record<string, unknown>): Pick<
MixerPreviewState,
"cropLeft" | "cropTop" | "cropRight" | "cropBottom"
> {
const hasCropField =
record.cropLeft !== undefined ||
record.cropTop !== undefined ||
record.cropRight !== undefined ||
record.cropBottom !== undefined;
const hasLegacyContentRectField =
record.contentX !== undefined ||
record.contentY !== undefined ||
record.contentWidth !== undefined ||
record.contentHeight !== undefined;
if (!hasCropField && hasLegacyContentRectField) {
const legacyRect = normalizeUnitRect({
x: record.contentX,
y: record.contentY,
width: record.contentWidth,
height: record.contentHeight,
defaults: {
x: 0,
y: 0,
width: 1,
height: 1,
},
});
return {
cropLeft: legacyRect.x,
cropTop: legacyRect.y,
cropRight: 1 - (legacyRect.x + legacyRect.width),
cropBottom: 1 - (legacyRect.y + legacyRect.height),
};
}
const cropLeft = clamp(
normalizeOverlayNumber(record.cropLeft, DEFAULT_CROP_LEFT),
0,
1 - MIN_OVERLAY_SIZE,
);
const cropTop = clamp(
normalizeOverlayNumber(record.cropTop, DEFAULT_CROP_TOP),
0,
1 - MIN_OVERLAY_SIZE,
);
const cropRight = clamp(
normalizeOverlayNumber(record.cropRight, DEFAULT_CROP_RIGHT),
0,
1 - cropLeft - MIN_OVERLAY_SIZE,
);
const cropBottom = clamp(
normalizeOverlayNumber(record.cropBottom, DEFAULT_CROP_BOTTOM),
0,
1 - cropTop - MIN_OVERLAY_SIZE,
);
return {
cropLeft,
cropTop,
cropRight,
cropBottom,
};
 }
 export function normalizeMixerPreviewData(data: unknown): Pick<
   MixerPreviewState,
-  "blendMode" | "opacity" | "offsetX" | "offsetY"
+  | "blendMode"
+  | "opacity"
+  | "overlayX"
+  | "overlayY"
+  | "overlayWidth"
+  | "overlayHeight"
+  | "cropLeft"
+  | "cropTop"
+  | "cropRight"
+  | "cropBottom"
 > {
   const record = (data ?? {}) as Record<string, unknown>;
   const blendMode = MIXER_BLEND_MODES.has(record.blendMode as MixerBlendMode)
@@ -86,8 +248,8 @@ export function normalizeMixerPreviewData(data: unknown): Pick<
   return {
     blendMode,
     opacity: normalizeOpacity(record.opacity),
-    offsetX: normalizeOffset(record.offsetX),
-    offsetY: normalizeOffset(record.offsetY),
+    ...normalizeOverlayRect(record),
+    ...normalizeCropEdges(record),
   };
 }
@@ -119,6 +281,17 @@ function resolveSourceUrlFromNode(args: {
 }
 if (args.sourceNode.type === "render") {
+  const preview = resolveRenderPreviewInputFromGraph({
+    nodeId: args.sourceNode.id,
+    graph: args.graph,
+  });
+  if (preview.sourceComposition) {
+    return undefined;
+  }
+  if (preview.sourceUrl) {
+    return preview.sourceUrl;
+  }
   const renderData = (args.sourceNode.data ?? {}) as Record<string, unknown>;
   const renderOutputUrl =
     typeof renderData.lastUploadUrl === "string" && renderData.lastUploadUrl.length > 0
@@ -133,11 +306,7 @@ function resolveSourceUrlFromNode(args: {
     return directRenderUrl;
   }
-  const preview = resolveRenderPreviewInputFromGraph({
-    nodeId: args.sourceNode.id,
-    graph: args.graph,
-  });
-  return preview.sourceUrl ?? undefined;
+  return undefined;
 }
 return resolveNodeImageUrl(args.sourceNode.data) ?? undefined;
@@ -172,6 +341,8 @@ export function resolveMixerPreviewFromGraph(args: {
 if (base.duplicate || overlay.duplicate) {
   return {
     status: "error",
+    baseUrl: undefined,
+    overlayUrl: undefined,
     ...normalized,
     error: "duplicate-handle-edge",
   };

View File

@@ -51,8 +51,14 @@ export const CANVAS_NODE_TEMPLATES = [
   defaultData: {
     blendMode: "normal",
     opacity: 100,
-    offsetX: 0,
-    offsetY: 0,
+    overlayX: 0,
+    overlayY: 0,
+    overlayWidth: 1,
+    overlayHeight: 1,
+    cropLeft: 0,
+    cropTop: 0,
+    cropRight: 0,
+    cropBottom: 0,
   },
 },
 {

View File

@@ -15,10 +15,29 @@ export type RenderPreviewGraphEdge = {
 };
 export type RenderPreviewInput = {
-  sourceUrl: string;
+  sourceUrl: string | null;
+  sourceComposition?: RenderPreviewSourceComposition;
   steps: PipelineStep[];
 };
+export type MixerBlendMode = "normal" | "multiply" | "screen" | "overlay";
+export type RenderPreviewSourceComposition = {
+  kind: "mixer";
+  baseUrl: string;
+  overlayUrl: string;
+  blendMode: MixerBlendMode;
+  opacity: number;
+  overlayX: number;
+  overlayY: number;
+  overlayWidth: number;
+  overlayHeight: number;
+  cropLeft: number;
+  cropTop: number;
+  cropRight: number;
+  cropBottom: number;
+};
 export type CanvasGraphNodeLike = {
   id: string;
   type: string;
@@ -38,6 +57,8 @@ export type CanvasGraphSnapshot = {
   incomingEdgesByTarget: ReadonlyMap<string, readonly CanvasGraphEdgeLike[]>;
 };
+type RenderPreviewResolvedInput = RenderPreviewInput;
 export type CanvasGraphNodeDataOverrides = ReadonlyMap<string, unknown>;
 export function shouldFastPathPreviewPipeline(
@@ -129,6 +150,188 @@ export const RENDER_PREVIEW_PIPELINE_TYPES = new Set([
   "detail-adjust",
 ]);
const MIXER_SOURCE_NODE_TYPES = new Set(["image", "asset", "ai-image", "render"]);
const MIXER_BLEND_MODES = new Set<MixerBlendMode>([
"normal",
"multiply",
"screen",
"overlay",
]);
const DEFAULT_BLEND_MODE: MixerBlendMode = "normal";
const DEFAULT_OPACITY = 100;
const MIN_OPACITY = 0;
const MAX_OPACITY = 100;
const DEFAULT_OVERLAY_X = 0;
const DEFAULT_OVERLAY_Y = 0;
const DEFAULT_OVERLAY_WIDTH = 1;
const DEFAULT_OVERLAY_HEIGHT = 1;
const DEFAULT_CROP_LEFT = 0;
const DEFAULT_CROP_TOP = 0;
const DEFAULT_CROP_RIGHT = 0;
const DEFAULT_CROP_BOTTOM = 0;
const MIN_OVERLAY_POSITION = 0;
const MAX_OVERLAY_POSITION = 1;
const MIN_OVERLAY_SIZE = 0.1;
const MAX_OVERLAY_SIZE = 1;
function clamp(value: number, min: number, max: number): number {
return Math.max(min, Math.min(max, value));
}
function parseNumeric(value: unknown): number | null {
if (typeof value === "number") {
return Number.isFinite(value) ? value : null;
}
if (typeof value === "string") {
const parsed = Number(value);
return Number.isFinite(parsed) ? parsed : null;
}
return null;
}
function normalizeOpacity(value: unknown): number {
const parsed = parseNumeric(value);
if (parsed === null) {
return DEFAULT_OPACITY;
}
return clamp(parsed, MIN_OPACITY, MAX_OPACITY);
}
function normalizeOverlayNumber(value: unknown, fallback: number): number {
const parsed = parseNumeric(value);
if (parsed === null) {
return fallback;
}
return parsed;
}
function normalizeMixerCompositionRect(data: Record<string, unknown>): Pick<
RenderPreviewSourceComposition,
"overlayX" | "overlayY" | "overlayWidth" | "overlayHeight"
> {
const hasLegacyOffset = data.offsetX !== undefined || data.offsetY !== undefined;
const hasOverlayRectField =
data.overlayX !== undefined ||
data.overlayY !== undefined ||
data.overlayWidth !== undefined ||
data.overlayHeight !== undefined;
if (hasLegacyOffset && !hasOverlayRectField) {
return {
overlayX: DEFAULT_OVERLAY_X,
overlayY: DEFAULT_OVERLAY_Y,
overlayWidth: DEFAULT_OVERLAY_WIDTH,
overlayHeight: DEFAULT_OVERLAY_HEIGHT,
};
}
const overlayX = clamp(
normalizeOverlayNumber(data.overlayX, DEFAULT_OVERLAY_X),
MIN_OVERLAY_POSITION,
MAX_OVERLAY_POSITION - MIN_OVERLAY_SIZE,
);
const overlayY = clamp(
normalizeOverlayNumber(data.overlayY, DEFAULT_OVERLAY_Y),
MIN_OVERLAY_POSITION,
MAX_OVERLAY_POSITION - MIN_OVERLAY_SIZE,
);
const overlayWidth = clamp(
normalizeOverlayNumber(data.overlayWidth, DEFAULT_OVERLAY_WIDTH),
MIN_OVERLAY_SIZE,
Math.min(MAX_OVERLAY_SIZE, MAX_OVERLAY_POSITION - overlayX),
);
const overlayHeight = clamp(
normalizeOverlayNumber(data.overlayHeight, DEFAULT_OVERLAY_HEIGHT),
MIN_OVERLAY_SIZE,
Math.min(MAX_OVERLAY_SIZE, MAX_OVERLAY_POSITION - overlayY),
);
return {
overlayX,
overlayY,
overlayWidth,
overlayHeight,
};
}
function normalizeMixerCompositionCropEdges(data: Record<string, unknown>): Pick<
RenderPreviewSourceComposition,
"cropLeft" | "cropTop" | "cropRight" | "cropBottom"
> {
const hasCropField =
data.cropLeft !== undefined ||
data.cropTop !== undefined ||
data.cropRight !== undefined ||
data.cropBottom !== undefined;
const hasLegacyContentRectField =
data.contentX !== undefined ||
data.contentY !== undefined ||
data.contentWidth !== undefined ||
data.contentHeight !== undefined;
if (!hasCropField && hasLegacyContentRectField) {
const contentX = clamp(
normalizeOverlayNumber(data.contentX, 0),
MIN_OVERLAY_POSITION,
MAX_OVERLAY_POSITION - MIN_OVERLAY_SIZE,
);
const contentY = clamp(
normalizeOverlayNumber(data.contentY, 0),
MIN_OVERLAY_POSITION,
MAX_OVERLAY_POSITION - MIN_OVERLAY_SIZE,
);
const contentWidth = clamp(
normalizeOverlayNumber(data.contentWidth, 1),
MIN_OVERLAY_SIZE,
Math.min(MAX_OVERLAY_SIZE, MAX_OVERLAY_POSITION - contentX),
);
const contentHeight = clamp(
normalizeOverlayNumber(data.contentHeight, 1),
MIN_OVERLAY_SIZE,
Math.min(MAX_OVERLAY_SIZE, MAX_OVERLAY_POSITION - contentY),
);
return {
cropLeft: contentX,
cropTop: contentY,
cropRight: 1 - (contentX + contentWidth),
cropBottom: 1 - (contentY + contentHeight),
};
}
const cropLeft = clamp(
normalizeOverlayNumber(data.cropLeft, DEFAULT_CROP_LEFT),
0,
1 - MIN_OVERLAY_SIZE,
);
const cropTop = clamp(
normalizeOverlayNumber(data.cropTop, DEFAULT_CROP_TOP),
0,
1 - MIN_OVERLAY_SIZE,
);
const cropRight = clamp(
normalizeOverlayNumber(data.cropRight, DEFAULT_CROP_RIGHT),
0,
1 - cropLeft - MIN_OVERLAY_SIZE,
);
const cropBottom = clamp(
normalizeOverlayNumber(data.cropBottom, DEFAULT_CROP_BOTTOM),
0,
1 - cropTop - MIN_OVERLAY_SIZE,
);
return {
cropLeft,
cropTop,
cropRight,
cropBottom,
};
}
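The legacy branch above maps a normalized content rect (`contentX`, `contentY`, `contentWidth`, `contentHeight`) onto the four crop edges, where the right/bottom edges are the remainders on the far sides of the rect. A standalone sketch of that conversion (the helper and type names are illustrative, not exports of this module):

```typescript
// Sketch: convert a legacy normalized content rect (0..1) into crop edges,
// mirroring the mapping in normalizeMixerCompositionCropEdges above.
type ContentRect = { x: number; y: number; width: number; height: number };
type CropEdges = { left: number; top: number; right: number; bottom: number };

function contentRectToCropEdges(rect: ContentRect): CropEdges {
  return {
    left: rect.x,
    top: rect.y,
    // The far edges are whatever the rect does not cover.
    right: 1 - (rect.x + rect.width),
    bottom: 1 - (rect.y + rect.height),
  };
}

// A rect covering the right half of the overlay crops away the left half.
const edges = contentRectToCropEdges({ x: 0.5, y: 0, width: 0.5, height: 1 });
console.log(edges); // { left: 0.5, top: 0, right: 0, bottom: 0 }
```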
export function resolveRenderFingerprint(data: unknown): {
resolution: RenderResolutionOption;
customWidth?: number;
@@ -163,15 +366,19 @@ export function resolveRenderFingerprint(data: unknown): {
export function resolveRenderPipelineHash(args: {
sourceUrl: string | null;
sourceComposition?: RenderPreviewSourceComposition;
steps: PipelineStep[];
data: unknown;
}): string | null {
-if (!args.sourceUrl) {
+if (!args.sourceUrl && !args.sourceComposition) {
return null;
}
return hashPipeline(
-{ sourceUrl: args.sourceUrl, render: resolveRenderFingerprint(args.data) },
+{
source: args.sourceComposition ?? args.sourceUrl,
render: resolveRenderFingerprint(args.data),
},
args.steps,
);
}
@@ -212,6 +419,119 @@ function resolveSourceNodeUrl(node: CanvasGraphNodeLike): string | null {
return resolveNodeImageUrl(node.data);
}
function resolveRenderOutputUrl(node: CanvasGraphNodeLike): string | null {
const data = (node.data ?? {}) as Record<string, unknown>;
const lastUploadUrl =
typeof data.lastUploadUrl === "string" && data.lastUploadUrl.length > 0
? data.lastUploadUrl
: null;
if (lastUploadUrl) {
return lastUploadUrl;
}
return resolveNodeImageUrl(node.data);
}
function resolveMixerHandleEdge(args: {
incomingEdges: readonly CanvasGraphEdgeLike[];
handle: "base" | "overlay";
}): CanvasGraphEdgeLike | null {
const filtered = args.incomingEdges.filter((edge) => {
if (args.handle === "base") {
return edge.targetHandle === "base" || edge.targetHandle == null || edge.targetHandle === "";
}
return edge.targetHandle === "overlay";
});
if (filtered.length !== 1) {
return null;
}
return filtered[0] ?? null;
}
function resolveMixerSourceUrlFromNode(args: {
node: CanvasGraphNodeLike;
graph: CanvasGraphSnapshot;
}): string | null {
if (!MIXER_SOURCE_NODE_TYPES.has(args.node.type)) {
return null;
}
if (args.node.type === "render") {
const preview = resolveRenderPreviewInputFromGraph({
nodeId: args.node.id,
graph: args.graph,
});
if (preview.sourceComposition) {
return null;
}
if (preview.sourceUrl) {
return preview.sourceUrl;
}
const directRenderUrl = resolveRenderOutputUrl(args.node);
if (directRenderUrl) {
return directRenderUrl;
}
return null;
}
return resolveNodeImageUrl(args.node.data);
}
function resolveMixerSourceUrlFromEdge(args: {
edge: CanvasGraphEdgeLike | null;
graph: CanvasGraphSnapshot;
}): string | null {
if (!args.edge) {
return null;
}
const sourceNode = args.graph.nodesById.get(args.edge.source);
if (!sourceNode) {
return null;
}
return resolveMixerSourceUrlFromNode({
node: sourceNode,
graph: args.graph,
});
}
function resolveRenderMixerCompositionFromGraph(args: {
node: CanvasGraphNodeLike;
graph: CanvasGraphSnapshot;
}): RenderPreviewSourceComposition | null {
const incomingEdges = args.graph.incomingEdgesByTarget.get(args.node.id) ?? [];
const baseEdge = resolveMixerHandleEdge({ incomingEdges, handle: "base" });
const overlayEdge = resolveMixerHandleEdge({ incomingEdges, handle: "overlay" });
const baseUrl = resolveMixerSourceUrlFromEdge({ edge: baseEdge, graph: args.graph });
const overlayUrl = resolveMixerSourceUrlFromEdge({ edge: overlayEdge, graph: args.graph });
if (!baseUrl || !overlayUrl) {
return null;
}
const data = (args.node.data ?? {}) as Record<string, unknown>;
const blendMode = MIXER_BLEND_MODES.has(data.blendMode as MixerBlendMode)
? (data.blendMode as MixerBlendMode)
: DEFAULT_BLEND_MODE;
return {
kind: "mixer",
baseUrl,
overlayUrl,
blendMode,
opacity: normalizeOpacity(data.opacity),
...normalizeMixerCompositionRect(data),
...normalizeMixerCompositionCropEdges(data),
};
}
export function buildGraphSnapshot(
nodes: readonly CanvasGraphNodeLike[],
edges: readonly CanvasGraphEdgeLike[],
@@ -384,7 +704,32 @@ export function findSourceNodeFromGraph(
export function resolveRenderPreviewInputFromGraph(args: {
nodeId: string;
graph: CanvasGraphSnapshot;
-}): { sourceUrl: string | null; steps: PipelineStep[] } {
+}): RenderPreviewResolvedInput {
const renderIncoming = getSortedIncomingEdge(
args.graph.incomingEdgesByTarget.get(args.nodeId),
);
const renderInputNode = renderIncoming
? args.graph.nodesById.get(renderIncoming.source)
: null;
if (renderInputNode?.type === "mixer") {
const sourceComposition = resolveRenderMixerCompositionFromGraph({
node: renderInputNode,
graph: args.graph,
});
const steps = collectPipelineFromGraph(args.graph, {
nodeId: args.nodeId,
isPipelineNode: (node) => RENDER_PREVIEW_PIPELINE_TYPES.has(node.type ?? ""),
});
return {
sourceUrl: null,
sourceComposition: sourceComposition ?? undefined,
steps,
};
}
const sourceUrl = getSourceImageFromGraph(args.graph, {
nodeId: args.nodeId,
isSourceNode: (node) => SOURCE_NODE_TYPES.has(node.type ?? ""),
@@ -406,7 +751,7 @@ export function resolveRenderPreviewInput(args: {
nodeId: string;
nodes: readonly RenderPreviewGraphNode[];
edges: readonly RenderPreviewGraphEdge[];
-}): { sourceUrl: string | null; steps: PipelineStep[] } {
+}): RenderPreviewResolvedInput {
return resolveRenderPreviewInputFromGraph({
nodeId: args.nodeId,
graph: buildGraphSnapshot(args.nodes, args.edges),

View File

@@ -200,6 +200,84 @@ export function canvasHandleAccentColorWithAlpha(
return `rgba(${r}, ${g}, ${b}, ${alpha})`;
}
function clampUnit(value: number): number {
if (!Number.isFinite(value)) {
return 0;
}
if (value <= 0) {
return 0;
}
if (value >= 1) {
return 1;
}
return value;
}
function lerp(min: number, max: number, t: number): number {
return min + (max - min) * t;
}
export function canvasHandleGlowShadow(args: {
nodeType: string | undefined;
handleId?: string | null;
handleType: "source" | "target";
strength: number;
colorMode: EdgeGlowColorMode;
}): string | undefined {
const strength = clampUnit(args.strength);
if (strength <= 0) {
return undefined;
}
const [r, g, b] = canvasHandleAccentRgb(args);
const isDark = args.colorMode === "dark";
const ringAlpha = isDark
? lerp(0.08, 0.3, strength)
: lerp(0.06, 0.2, strength);
const glowAlpha = isDark
? lerp(0.12, 0.58, strength)
: lerp(0.08, 0.34, strength);
const ringSize = isDark
? lerp(1.8, 6.4, strength)
: lerp(1.5, 5.2, strength);
const glowSize = isDark
? lerp(4.5, 15, strength)
: lerp(3.5, 12, strength);
return `0 0 0 ${ringSize.toFixed(2)}px rgba(${r}, ${g}, ${b}, ${ringAlpha.toFixed(3)}), 0 0 ${glowSize.toFixed(2)}px rgba(${r}, ${g}, ${b}, ${glowAlpha.toFixed(3)})`;
}
export function connectionLineGlowFilter(args: {
nodeType: string | undefined;
handleId: string | null | undefined;
strength: number;
colorMode: EdgeGlowColorMode;
}): string | undefined {
const strength = clampUnit(args.strength);
if (strength <= 0) {
return undefined;
}
const [r, g, b] = connectionLineAccentRgb(args.nodeType, args.handleId);
const isDark = args.colorMode === "dark";
const innerAlpha = isDark
? lerp(0.22, 0.72, strength)
: lerp(0.12, 0.42, strength);
const outerAlpha = isDark
? lerp(0.12, 0.38, strength)
: lerp(0.06, 0.2, strength);
const innerBlur = isDark
? lerp(2.4, 4.2, strength)
: lerp(2, 3.4, strength);
const outerBlur = isDark
? lerp(5.4, 9.8, strength)
: lerp(4.6, 7.8, strength);
return `drop-shadow(0 0 ${innerBlur.toFixed(2)}px rgba(${r}, ${g}, ${b}, ${innerAlpha.toFixed(3)})) drop-shadow(0 0 ${outerBlur.toFixed(2)}px rgba(${r}, ${g}, ${b}, ${outerAlpha.toFixed(3)}))`;
}
/**
 * RGB for the temporary connection line (source node + optional handle, e.g. reconnect).
 */
@@ -359,8 +437,14 @@ export const NODE_DEFAULTS: Record<
data: {
blendMode: "normal",
opacity: 100,
-offsetX: 0,
-offsetY: 0,
+overlayX: 0,
+overlayY: 0,
overlayWidth: 1,
overlayHeight: 1,
cropLeft: 0,
cropTop: 0,
cropRight: 0,
cropBottom: 0,
},
},
"agent-output": {

View File

@@ -10,7 +10,7 @@ import {
applyGeometryStepsToSource,
partitionPipelineSteps,
} from "@/lib/image-pipeline/geometry-transform";
-import { loadSourceBitmap } from "@/lib/image-pipeline/source-loader";
+import { loadRenderSourceBitmap } from "@/lib/image-pipeline/source-loader";
type SupportedCanvas = HTMLCanvasElement | OffscreenCanvas;
type SupportedContext = CanvasRenderingContext2D | OffscreenCanvasRenderingContext2D;
@@ -99,7 +99,11 @@ function resolveMimeType(format: RenderFormat): string {
export async function renderFull(options: RenderFullOptions): Promise<RenderFullResult> {
const { signal } = options;
-const bitmap = await loadSourceBitmap(options.sourceUrl, { signal });
+const bitmap = await loadRenderSourceBitmap({
sourceUrl: options.sourceUrl,
sourceComposition: options.sourceComposition,
signal,
});
const { geometrySteps, tonalSteps } = partitionPipelineSteps(options.steps);
const geometryResult = applyGeometryStepsToSource({
source: bitmap,

View File

@@ -2,21 +2,26 @@ import { renderFull } from "@/lib/image-pipeline/bridge";
import { renderPreview } from "@/lib/image-pipeline/preview-renderer";
import type { PipelineStep } from "@/lib/image-pipeline/contracts";
import type { HistogramData } from "@/lib/image-pipeline/histogram";
-import type { RenderFullOptions, RenderFullResult } from "@/lib/image-pipeline/render-types";
+import type {
RenderFullOptions,
RenderFullResult,
RenderSourceComposition,
} from "@/lib/image-pipeline/render-types";
import {
IMAGE_PIPELINE_BACKEND_FLAG_KEYS,
type BackendFeatureFlags,
} from "@/lib/image-pipeline/backend/feature-flags";
type PreviewWorkerPayload = {
-sourceUrl: string;
+sourceUrl?: string;
sourceComposition?: RenderSourceComposition;
steps: readonly PipelineStep[];
previewWidth: number;
includeHistogram?: boolean;
featureFlags?: BackendFeatureFlags;
};
-type FullWorkerPayload = RenderFullOptions & {
+type FullWorkerPayload = Omit<RenderFullOptions, "signal"> & {
featureFlags?: BackendFeatureFlags;
};
@@ -112,6 +117,7 @@ async function handlePreviewRequest(requestId: number, payload: PreviewWorkerPay
applyWorkerFeatureFlags(payload.featureFlags);
const result = await renderPreview({
sourceUrl: payload.sourceUrl,
sourceComposition: payload.sourceComposition,
steps: payload.steps,
previewWidth: payload.previewWidth,
includeHistogram: payload.includeHistogram,
@@ -161,6 +167,7 @@ async function handleFullRequest(requestId: number, payload: FullWorkerPayload):
applyWorkerFeatureFlags(payload.featureFlags);
const result = await renderFull({
sourceUrl: payload.sourceUrl,
sourceComposition: payload.sourceComposition,
steps: payload.steps,
render: payload.render,
signal: controller.signal,

View File

@@ -8,7 +8,8 @@ import {
applyGeometryStepsToSource,
partitionPipelineSteps,
} from "@/lib/image-pipeline/geometry-transform";
-import { loadSourceBitmap } from "@/lib/image-pipeline/source-loader";
+import { loadRenderSourceBitmap } from "@/lib/image-pipeline/source-loader";
import type { RenderSourceComposition } from "@/lib/image-pipeline/render-types";
export type PreviewRenderResult = {
width: number;
@@ -64,13 +65,16 @@ async function yieldToMainOrWorkerLoop(): Promise<void> {
}
export async function renderPreview(options: {
-sourceUrl: string;
+sourceUrl?: string;
sourceComposition?: RenderSourceComposition;
steps: readonly PipelineStep[];
previewWidth: number;
includeHistogram?: boolean;
signal?: AbortSignal;
}): Promise<PreviewRenderResult> {
-const bitmap = await loadSourceBitmap(options.sourceUrl, {
+const bitmap = await loadRenderSourceBitmap({
sourceUrl: options.sourceUrl,
sourceComposition: options.sourceComposition,
signal: options.signal,
});
const { geometrySteps, tonalSteps } = partitionPipelineSteps(options.steps);

View File

@@ -24,6 +24,22 @@ export type RenderSizeLimits = {
maxPixels?: number;
};
export type RenderSourceComposition = {
kind: "mixer";
baseUrl: string;
overlayUrl: string;
blendMode: "normal" | "multiply" | "screen" | "overlay";
opacity: number;
overlayX: number;
overlayY: number;
overlayWidth: number;
overlayHeight: number;
cropLeft: number;
cropTop: number;
cropRight: number;
cropBottom: number;
};
export type ResolvedRenderSize = {
width: number;
height: number;
@@ -32,7 +48,8 @@ };
};
export type RenderFullOptions = {
-sourceUrl: string;
+sourceUrl?: string;
sourceComposition?: RenderSourceComposition;
steps: readonly PipelineStep[];
render: RenderOptions;
limits?: RenderSizeLimits;

View File

@@ -1,3 +1,6 @@
import type { RenderSourceComposition } from "@/lib/image-pipeline/render-types";
import { computeVisibleMixerContentRect } from "@/lib/mixer-crop-layout";
export const SOURCE_BITMAP_CACHE_MAX_ENTRIES = 32;
type CacheEntry = {
@@ -12,6 +15,12 @@ type LoadSourceBitmapOptions = {
signal?: AbortSignal;
};
type LoadRenderSourceBitmapOptions = {
sourceUrl?: string;
sourceComposition?: RenderSourceComposition;
signal?: AbortSignal;
};
function throwIfAborted(signal: AbortSignal | undefined): void {
if (signal?.aborted) {
throw new DOMException("The operation was aborted.", "AbortError");
@@ -215,3 +224,219 @@ export async function loadSourceBitmap(
const promise = getOrCreateSourceBitmapPromise(sourceUrl);
return await awaitWithLocalAbort(promise, options.signal);
}
function createWorkingCanvas(width: number, height: number):
| HTMLCanvasElement
| OffscreenCanvas {
if (typeof document !== "undefined") {
const canvas = document.createElement("canvas");
canvas.width = width;
canvas.height = height;
return canvas;
}
if (typeof OffscreenCanvas !== "undefined") {
return new OffscreenCanvas(width, height);
}
throw new Error("Canvas rendering is not available in this environment.");
}
function mixerBlendModeToCompositeOperation(
blendMode: RenderSourceComposition["blendMode"],
): GlobalCompositeOperation {
if (blendMode === "normal") {
return "source-over";
}
return blendMode;
}
function normalizeCompositionOpacity(value: number): number {
if (!Number.isFinite(value)) {
return 1;
}
return Math.max(0, Math.min(100, value)) / 100;
}
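The mixer's opacity control arrives as a 0..100 node value and must become a 0..1 canvas `globalAlpha`; the normalization above can be checked in isolation. A minimal standalone re-implementation for illustration (not an export of source-loader):

```typescript
// Sketch: clamp a 0..100 opacity control value into canvas globalAlpha (0..1).
// Non-finite input falls back to fully opaque, matching the code above.
function opacityToAlpha(value: number): number {
  if (!Number.isFinite(value)) {
    return 1;
  }
  return Math.max(0, Math.min(100, value)) / 100;
}

console.log(opacityToAlpha(75)); // 0.75
console.log(opacityToAlpha(250)); // 1
console.log(opacityToAlpha(Number.NaN)); // 1
```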
function normalizeRatio(value: number, fallback: number): number {
if (!Number.isFinite(value)) {
return fallback;
}
return value;
}
function normalizeMixerRect(source: RenderSourceComposition): {
x: number;
y: number;
width: number;
height: number;
} {
const overlayX = Math.max(0, Math.min(0.9, normalizeRatio(source.overlayX, 0)));
const overlayY = Math.max(0, Math.min(0.9, normalizeRatio(source.overlayY, 0)));
const overlayWidth = Math.max(
0.1,
Math.min(1, normalizeRatio(source.overlayWidth, 1), 1 - overlayX),
);
const overlayHeight = Math.max(
0.1,
Math.min(1, normalizeRatio(source.overlayHeight, 1), 1 - overlayY),
);
return {
x: overlayX,
y: overlayY,
width: overlayWidth,
height: overlayHeight,
};
}
function normalizeMixerCropEdges(source: RenderSourceComposition): {
left: number;
top: number;
right: number;
bottom: number;
} {
const legacySource = source as RenderSourceComposition & {
contentX?: number;
contentY?: number;
contentWidth?: number;
contentHeight?: number;
};
const hasLegacyContentRect =
legacySource.contentX !== undefined ||
legacySource.contentY !== undefined ||
legacySource.contentWidth !== undefined ||
legacySource.contentHeight !== undefined;
if (hasLegacyContentRect) {
const contentX = Math.max(
0,
Math.min(0.9, normalizeRatio(legacySource.contentX ?? Number.NaN, 0)),
);
const contentY = Math.max(
0,
Math.min(0.9, normalizeRatio(legacySource.contentY ?? Number.NaN, 0)),
);
const contentWidth = Math.max(
0.1,
Math.min(1, normalizeRatio(legacySource.contentWidth ?? Number.NaN, 1), 1 - contentX),
);
const contentHeight = Math.max(
0.1,
Math.min(1, normalizeRatio(legacySource.contentHeight ?? Number.NaN, 1), 1 - contentY),
);
return {
left: contentX,
top: contentY,
right: 1 - (contentX + contentWidth),
bottom: 1 - (contentY + contentHeight),
};
}
const cropLeft = Math.max(0, Math.min(0.9, normalizeRatio(source.cropLeft, 0)));
const cropTop = Math.max(0, Math.min(0.9, normalizeRatio(source.cropTop, 0)));
const cropRight = Math.max(0, Math.min(1 - cropLeft - 0.1, normalizeRatio(source.cropRight, 0)));
const cropBottom = Math.max(
0,
Math.min(1 - cropTop - 0.1, normalizeRatio(source.cropBottom, 0)),
);
return {
left: cropLeft,
top: cropTop,
right: cropRight,
bottom: cropBottom,
};
}
async function loadMixerCompositionBitmap(
sourceComposition: RenderSourceComposition,
signal?: AbortSignal,
): Promise<ImageBitmap> {
const [baseBitmap, overlayBitmap] = await Promise.all([
loadSourceBitmap(sourceComposition.baseUrl, { signal }),
loadSourceBitmap(sourceComposition.overlayUrl, { signal }),
]);
throwIfAborted(signal);
const canvas = createWorkingCanvas(baseBitmap.width, baseBitmap.height);
const context = canvas.getContext("2d", { willReadFrequently: true });
if (!context) {
throw new Error("Render composition could not create a 2D context.");
}
context.clearRect(0, 0, baseBitmap.width, baseBitmap.height);
context.drawImage(baseBitmap, 0, 0, baseBitmap.width, baseBitmap.height);
const rect = normalizeMixerRect(sourceComposition);
const frameX = rect.x * baseBitmap.width;
const frameY = rect.y * baseBitmap.height;
const frameWidth = rect.width * baseBitmap.width;
const frameHeight = rect.height * baseBitmap.height;
const cropEdges = normalizeMixerCropEdges(sourceComposition);
const sourceX = cropEdges.left * overlayBitmap.width;
const sourceY = cropEdges.top * overlayBitmap.height;
const sourceWidth = (1 - cropEdges.left - cropEdges.right) * overlayBitmap.width;
const sourceHeight = (1 - cropEdges.top - cropEdges.bottom) * overlayBitmap.height;
const visibleRect = computeVisibleMixerContentRect({
frameAspectRatio: frameHeight > 0 ? frameWidth / frameHeight : 1,
sourceWidth: overlayBitmap.width,
sourceHeight: overlayBitmap.height,
cropLeft: cropEdges.left,
cropTop: cropEdges.top,
cropRight: cropEdges.right,
cropBottom: cropEdges.bottom,
});
const destX = frameX + (visibleRect?.x ?? 0) * frameWidth;
const destY = frameY + (visibleRect?.y ?? 0) * frameHeight;
const destWidth = (visibleRect?.width ?? 1) * frameWidth;
const destHeight = (visibleRect?.height ?? 1) * frameHeight;
context.globalCompositeOperation = mixerBlendModeToCompositeOperation(
sourceComposition.blendMode,
);
context.globalAlpha = normalizeCompositionOpacity(sourceComposition.opacity);
context.save();
context.beginPath();
context.rect(frameX, frameY, frameWidth, frameHeight);
context.clip();
context.drawImage(
overlayBitmap,
sourceX,
sourceY,
sourceWidth,
sourceHeight,
destX,
destY,
destWidth,
destHeight,
);
context.restore();
context.globalCompositeOperation = "source-over";
context.globalAlpha = 1;
return await createImageBitmap(canvas);
}
export async function loadRenderSourceBitmap(
options: LoadRenderSourceBitmapOptions,
): Promise<ImageBitmap> {
if (options.sourceComposition) {
if (options.sourceComposition.kind !== "mixer") {
throw new Error(`Unsupported source composition '${options.sourceComposition.kind}'.`);
}
return await loadMixerCompositionBitmap(options.sourceComposition, options.signal);
}
if (!options.sourceUrl) {
throw new Error("Render source is required.");
}
return await loadSourceBitmap(options.sourceUrl, { signal: options.signal });
}
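In `loadMixerCompositionBitmap` above, the normalized crop edges are scaled by the overlay bitmap's pixel size to form the source rect of the nine-argument `drawImage` call. That mapping is a pure function and can be sketched on its own (the helper name is hypothetical, not part of this module):

```typescript
// Sketch: map normalized crop edges (fractions 0..1 of the overlay bitmap)
// to the pixel-space source rect passed to the nine-argument drawImage call.
function cropEdgesToSourceRect(
  bitmapWidth: number,
  bitmapHeight: number,
  edges: { left: number; top: number; right: number; bottom: number },
): { x: number; y: number; width: number; height: number } {
  return {
    x: edges.left * bitmapWidth,
    y: edges.top * bitmapHeight,
    // What remains after cropping both opposing edges.
    width: (1 - edges.left - edges.right) * bitmapWidth,
    height: (1 - edges.top - edges.bottom) * bitmapHeight,
  };
}

// Cropping 25% off each horizontal side of a 400×200 bitmap keeps the middle 200px.
const rect = cropEdgesToSourceRect(400, 200, { left: 0.25, top: 0, right: 0.25, bottom: 0 });
console.log(rect); // { x: 100, y: 0, width: 200, height: 200 }
```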

View File

@@ -5,7 +5,11 @@ import {
} from "@/lib/image-pipeline/preview-renderer";
import { hashPipeline, type PipelineStep } from "@/lib/image-pipeline/contracts";
import type { HistogramData } from "@/lib/image-pipeline/histogram";
-import type { RenderFullOptions, RenderFullResult } from "@/lib/image-pipeline/render-types";
+import type {
RenderFullOptions,
RenderFullResult,
RenderSourceComposition,
} from "@/lib/image-pipeline/render-types";
import {
getBackendFeatureFlags,
type BackendFeatureFlags,
@@ -20,14 +24,15 @@ export type BackendDiagnosticsMetadata = {
};
type PreviewWorkerPayload = {
-sourceUrl: string;
+sourceUrl?: string;
sourceComposition?: RenderSourceComposition;
steps: readonly PipelineStep[];
previewWidth: number;
includeHistogram?: boolean;
featureFlags?: BackendFeatureFlags;
};
-type FullWorkerPayload = RenderFullOptions & {
+type FullWorkerPayload = Omit<RenderFullOptions, "signal"> & {
featureFlags?: BackendFeatureFlags;
};
@@ -318,19 +323,20 @@ function runWorkerRequest<TResponse extends PreviewRenderResult | RenderFullResu
worker.postMessage({
kind: "full",
requestId,
-payload: args.payload as RenderFullOptions,
+payload: args.payload as FullWorkerPayload,
} satisfies WorkerRequestMessage);
});
}
function getPreviewRequestKey(options: {
-sourceUrl: string;
+sourceUrl?: string;
sourceComposition?: RenderSourceComposition;
steps: readonly PipelineStep[];
previewWidth: number;
includeHistogram?: boolean;
}): string {
return [
-hashPipeline(options.sourceUrl, options.steps),
+hashPipeline(options.sourceComposition ?? options.sourceUrl ?? null, options.steps),
options.previewWidth,
options.includeHistogram === true ? 1 : 0,
].join(":");
@@ -341,7 +347,8 @@ function getWorkerFeatureFlagsSnapshot(): BackendFeatureFlags {
}
async function runPreviewRequest(options: {
-sourceUrl: string;
+sourceUrl?: string;
sourceComposition?: RenderSourceComposition;
steps: readonly PipelineStep[];
previewWidth: number;
includeHistogram?: boolean;
@@ -352,6 +359,7 @@ async function runPreviewRequest(options: {
kind: "preview",
payload: {
sourceUrl: options.sourceUrl,
sourceComposition: options.sourceComposition,
steps: options.steps,
previewWidth: options.previewWidth,
includeHistogram: options.includeHistogram,
@@ -367,6 +375,7 @@ async function runPreviewRequest(options: {
if (!shouldFallbackToMainThread(error)) {
logWorkerClientDebug("preview request failed without fallback", {
sourceUrl: options.sourceUrl,
sourceComposition: options.sourceComposition,
previewWidth: options.previewWidth,
includeHistogram: options.includeHistogram,
diagnostics: getLastBackendDiagnostics(),
@@ -377,6 +386,7 @@ async function runPreviewRequest(options: {
logWorkerClientDebug("preview request falling back to main-thread", {
sourceUrl: options.sourceUrl,
sourceComposition: options.sourceComposition,
previewWidth: options.previewWidth,
includeHistogram: options.includeHistogram,
error,
@@ -387,7 +397,8 @@ async function runPreviewRequest(options: {
}
function getOrCreateSharedPreviewRequest(options: {
-sourceUrl: string;
+sourceUrl?: string;
sourceComposition?: RenderSourceComposition;
steps: readonly PipelineStep[];
previewWidth: number;
includeHistogram?: boolean;
@@ -419,7 +430,8 @@ function getOrCreateSharedPreviewRequest(options: {
}
export async function renderPreviewWithWorkerFallback(options: {
-sourceUrl: string;
+sourceUrl?: string;
sourceComposition?: RenderSourceComposition;
steps: readonly PipelineStep[];
previewWidth: number;
includeHistogram?: boolean;
@@ -431,6 +443,7 @@ export async function renderPreviewWithWorkerFallback(options: {
const sharedRequest = getOrCreateSharedPreviewRequest({
sourceUrl: options.sourceUrl,
sourceComposition: options.sourceComposition,
steps: options.steps,
previewWidth: options.previewWidth,
includeHistogram: options.includeHistogram,
@@ -488,14 +501,16 @@ export async function renderPreviewWithWorkerFallback(options: {
export async function renderFullWithWorkerFallback(
options: RenderFullOptions,
): Promise<RenderFullResult> {
const { signal, ...serializableOptions } = options;
try {
return await runWorkerRequest<RenderFullResult>({
kind: "full",
payload: {
-...options,
+...serializableOptions,
featureFlags: getWorkerFeatureFlagsSnapshot(),
},
-signal: options.signal,
+signal,
});
} catch (error: unknown) {
if (isAbortError(error)) {

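The `const { signal, ...serializableOptions } = options;` change above pulls the abort signal out of the worker payload before `postMessage`: an `AbortSignal` is not structured-cloneable, so it has to stay on the posting side while the rest of the options cross the worker boundary. A minimal sketch of that pattern (type and function names here are illustrative, not the project's):

```typescript
// Sketch: strip the non-cloneable AbortSignal off a worker payload before
// posting, while keeping the signal usable on the calling side.
type FullRenderOptions = {
  sourceUrl?: string;
  quality: number;
  signal?: AbortSignal;
};

function splitWorkerPayload(options: FullRenderOptions): {
  payload: Omit<FullRenderOptions, "signal">;
  signal?: AbortSignal;
} {
  // Rest destructuring removes the `signal` key entirely, so the payload
  // object is safe to structured-clone into the worker.
  const { signal, ...serializableOptions } = options;
  return { payload: serializableOptions, signal };
}
```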
lib/mixer-crop-layout.ts Normal file

@@ -0,0 +1,219 @@
const MIN_CROP_REMAINING_SIZE = 0.1;
type MixerSurfaceFit = "contain" | "cover";
function formatPercent(value: number): string {
const normalized = Math.abs(value) < 1e-10 ? 0 : value;
return `${normalized}%`;
}
function computeFittedRect(args: {
sourceWidth: number;
sourceHeight: number;
boundsX: number;
boundsY: number;
boundsWidth: number;
boundsHeight: number;
fit?: MixerSurfaceFit;
}): { x: number; y: number; width: number; height: number } {
const {
sourceWidth,
sourceHeight,
boundsX,
boundsY,
boundsWidth,
boundsHeight,
fit = "contain",
} = args;
if (sourceWidth <= 0 || sourceHeight <= 0 || boundsWidth <= 0 || boundsHeight <= 0) {
return {
x: boundsX,
y: boundsY,
width: boundsWidth,
height: boundsHeight,
};
}
const scale =
fit === "cover"
? Math.max(boundsWidth / sourceWidth, boundsHeight / sourceHeight)
: Math.min(boundsWidth / sourceWidth, boundsHeight / sourceHeight);
if (!Number.isFinite(scale) || scale <= 0) {
return {
x: boundsX,
y: boundsY,
width: boundsWidth,
height: boundsHeight,
};
}
const width = sourceWidth * scale;
const height = sourceHeight * scale;
return {
x: boundsX + (boundsWidth - width) / 2,
y: boundsY + (boundsHeight - height) / 2,
width,
height,
};
}
export function computeMixerFrameRectInSurface(args: {
surfaceWidth: number;
surfaceHeight: number;
baseWidth: number;
baseHeight: number;
overlayX: number;
overlayY: number;
overlayWidth: number;
overlayHeight: number;
fit?: MixerSurfaceFit;
}): { x: number; y: number; width: number; height: number } | null {
if (args.baseWidth <= 0 || args.baseHeight <= 0 || args.surfaceWidth <= 0 || args.surfaceHeight <= 0) {
return null;
}
const baseRect = computeFittedRect({
sourceWidth: args.baseWidth,
sourceHeight: args.baseHeight,
boundsX: 0,
boundsY: 0,
boundsWidth: args.surfaceWidth,
boundsHeight: args.surfaceHeight,
fit: args.fit,
});
return {
x: (baseRect.x + args.overlayX * baseRect.width) / args.surfaceWidth,
y: (baseRect.y + args.overlayY * baseRect.height) / args.surfaceHeight,
width: (args.overlayWidth * baseRect.width) / args.surfaceWidth,
height: (args.overlayHeight * baseRect.height) / args.surfaceHeight,
};
}
export function computeVisibleMixerContentRect(args: {
frameAspectRatio: number;
sourceWidth: number;
sourceHeight: number;
cropLeft: number;
cropTop: number;
cropRight: number;
cropBottom: number;
}): { x: number; y: number; width: number; height: number } | null {
if (args.sourceWidth <= 0 || args.sourceHeight <= 0) {
return null;
}
const cropWidth = Math.max(1 - args.cropLeft - args.cropRight, MIN_CROP_REMAINING_SIZE);
const cropHeight = Math.max(1 - args.cropTop - args.cropBottom, MIN_CROP_REMAINING_SIZE);
const frameAspectRatio = args.frameAspectRatio > 0 ? args.frameAspectRatio : 1;
const rect = computeFittedRect({
sourceWidth: args.sourceWidth * cropWidth,
sourceHeight: args.sourceHeight * cropHeight,
boundsX: 0,
boundsY: 0,
boundsWidth: frameAspectRatio,
boundsHeight: 1,
});
return {
x: rect.x / frameAspectRatio,
y: rect.y,
width: rect.width / frameAspectRatio,
height: rect.height,
};
}
export function computeMixerCropImageStyle(args: {
frameAspectRatio: number;
sourceWidth: number;
sourceHeight: number;
cropLeft: number;
cropTop: number;
cropRight: number;
cropBottom: number;
}) {
const safeWidth = Math.max(1 - args.cropLeft - args.cropRight, MIN_CROP_REMAINING_SIZE);
const safeHeight = Math.max(1 - args.cropTop - args.cropBottom, MIN_CROP_REMAINING_SIZE);
const visibleRect = computeVisibleMixerContentRect(args);
if (!visibleRect) {
return {
left: formatPercent((-args.cropLeft / safeWidth) * 100),
top: formatPercent((-args.cropTop / safeHeight) * 100),
width: formatPercent((1 / safeWidth) * 100),
height: formatPercent((1 / safeHeight) * 100),
} as const;
}
const imageWidth = visibleRect.width / safeWidth;
const imageHeight = visibleRect.height / safeHeight;
return {
left: formatPercent((visibleRect.x - (args.cropLeft / safeWidth) * visibleRect.width) * 100),
top: formatPercent((visibleRect.y - (args.cropTop / safeHeight) * visibleRect.height) * 100),
width: formatPercent(imageWidth * 100),
height: formatPercent(imageHeight * 100),
} as const;
}
export function computeMixerCompareOverlayImageStyle(args: {
surfaceWidth: number;
surfaceHeight: number;
baseWidth: number;
baseHeight: number;
overlayX: number;
overlayY: number;
overlayWidth: number;
overlayHeight: number;
sourceWidth: number;
sourceHeight: number;
cropLeft: number;
cropTop: number;
cropRight: number;
cropBottom: number;
}) {
const frameRect = computeMixerFrameRectInSurface({
surfaceWidth: args.surfaceWidth,
surfaceHeight: args.surfaceHeight,
baseWidth: args.baseWidth,
baseHeight: args.baseHeight,
overlayX: args.overlayX,
overlayY: args.overlayY,
overlayWidth: args.overlayWidth,
overlayHeight: args.overlayHeight,
});
const frameAspectRatio =
frameRect && frameRect.width > 0 && frameRect.height > 0
? (frameRect.width * args.surfaceWidth) / (frameRect.height * args.surfaceHeight)
: args.overlayWidth > 0 && args.overlayHeight > 0
? args.overlayWidth / args.overlayHeight
: 1;
return computeMixerCropImageStyle({
frameAspectRatio,
sourceWidth: args.sourceWidth,
sourceHeight: args.sourceHeight,
cropLeft: args.cropLeft,
cropTop: args.cropTop,
cropRight: args.cropRight,
cropBottom: args.cropBottom,
});
}
export function isMixerCropImageReady(args: {
currentOverlayUrl: string | null | undefined;
loadedOverlayUrl: string | null;
sourceWidth: number;
sourceHeight: number;
}): boolean {
return Boolean(
args.currentOverlayUrl &&
args.loadedOverlayUrl === args.currentOverlayUrl &&
args.sourceWidth > 0 &&
args.sourceHeight > 0,
);
}
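The contain-fit used throughout this file can be sketched standalone. The helper below is a hypothetical minimal copy for illustration (my own name, not an import of `lib/mixer-crop-layout`): scale the source uniformly so it fits inside the bounds, then center the scaled rect.

```typescript
// Standalone sketch of the "contain" fit above (assumed behavior, not the
// production computeFittedRect): uniform scale, then center in the bounds.
function containFit(
  sourceWidth: number,
  sourceHeight: number,
  boundsWidth: number,
  boundsHeight: number,
): { x: number; y: number; width: number; height: number } {
  // "contain" takes the smaller axis ratio; a "cover" fit would use Math.max.
  const scale = Math.min(boundsWidth / sourceWidth, boundsHeight / sourceHeight);
  const width = sourceWidth * scale;
  const height = sourceHeight * scale;
  // Center the scaled rect inside the bounds.
  return { x: (boundsWidth - width) / 2, y: (boundsHeight - height) / 2, width, height };
}

// A 200x100 source contained in a 100x100 box keeps its 2:1 aspect ratio:
// → { x: 0, y: 25, width: 100, height: 50 }
```

`computeMixerFrameRectInSurface` then divides such a rect by the surface size to get the normalized 0..1 coordinates the node data uses.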


@@ -132,6 +132,16 @@ describe("canvas connection policy", () => {
).toBeNull();
});
it("allows mixer as render source", () => {
expect(
validateCanvasConnectionPolicy({
sourceType: "mixer",
targetType: "render",
targetIncomingCount: 0,
}),
).toBeNull();
});
it("describes unsupported crop source message", () => {
expect(getCanvasConnectionValidationMessage("crop-source-invalid")).toBe(
"Crop akzeptiert nur Bild-, Asset-, KI-Bild-, Video-, KI-Video-, Crop- oder Adjustment-Input.",


@@ -17,6 +17,13 @@ const sourceLoaderMocks = vi.hoisted(() => ({
vi.mock("@/lib/image-pipeline/source-loader", () => ({
loadSourceBitmap: sourceLoaderMocks.loadSourceBitmap,
loadRenderSourceBitmap: ({ sourceUrl }: { sourceUrl?: string }) => {
if (!sourceUrl) {
throw new Error("Render source is required.");
}
return sourceLoaderMocks.loadSourceBitmap(sourceUrl);
},
}));
function createPreviewPixels(): Uint8ClampedArray {


@@ -0,0 +1,117 @@
import { beforeEach, describe, expect, it, vi } from "vitest";
import type { RenderFullResult, RenderSourceComposition } from "@/lib/image-pipeline/render-types";
const bridgeMocks = vi.hoisted(() => ({
renderFull: vi.fn(),
}));
const previewRendererMocks = vi.hoisted(() => ({
renderPreview: vi.fn(),
}));
vi.mock("@/lib/image-pipeline/bridge", () => ({
renderFull: bridgeMocks.renderFull,
}));
vi.mock("@/lib/image-pipeline/preview-renderer", () => ({
renderPreview: previewRendererMocks.renderPreview,
}));
type WorkerMessage = {
kind: "full";
requestId: number;
payload: {
sourceUrl?: string;
sourceComposition?: RenderSourceComposition;
steps: [];
render: {
resolution: "original";
format: "png";
};
};
};
type WorkerScopeMock = {
postMessage: ReturnType<typeof vi.fn>;
onmessage: ((event: MessageEvent<WorkerMessage>) => void) | null;
};
function createFullResult(): RenderFullResult {
return {
blob: new Blob(["rendered"]),
width: 64,
height: 64,
mimeType: "image/png",
format: "png",
quality: null,
sizeBytes: 8,
sourceWidth: 64,
sourceHeight: 64,
wasSizeClamped: false,
};
}
function createWorkerScope(): WorkerScopeMock {
return {
postMessage: vi.fn(),
onmessage: null,
};
}
describe("image-pipeline.worker full render", () => {
beforeEach(() => {
vi.resetModules();
vi.unstubAllGlobals();
bridgeMocks.renderFull.mockReset();
bridgeMocks.renderFull.mockResolvedValue(createFullResult());
previewRendererMocks.renderPreview.mockReset();
});
it("forwards sourceComposition to renderFull for full requests", async () => {
const workerScope = createWorkerScope();
vi.stubGlobal("self", workerScope);
await import("@/lib/image-pipeline/image-pipeline.worker");
const sourceComposition: RenderSourceComposition = {
kind: "mixer",
baseUrl: "https://cdn.example.com/base.png",
overlayUrl: "https://cdn.example.com/overlay.png",
blendMode: "overlay",
opacity: 0.5,
overlayX: 32,
overlayY: 16,
overlayWidth: 128,
overlayHeight: 64,
cropLeft: 0,
cropTop: 0,
cropRight: 0,
cropBottom: 0,
};
workerScope.onmessage?.({
data: {
kind: "full",
requestId: 41,
payload: {
sourceComposition,
steps: [],
render: {
resolution: "original",
format: "png",
},
},
},
} as MessageEvent<WorkerMessage>);
await vi.waitFor(() => {
expect(bridgeMocks.renderFull).toHaveBeenCalledTimes(1);
});
expect(bridgeMocks.renderFull).toHaveBeenCalledWith(
expect.objectContaining({
sourceComposition,
}),
);
});
});


@@ -355,4 +355,446 @@ describe("loadSourceBitmap", () => {
expect(createImageBitmap).toHaveBeenCalledWith(fakeVideo);
expect(revokeObjectUrl).toHaveBeenCalledWith("blob:video-source");
});
it("renders non-square mixer overlays with contain-fit parity instead of stretching", async () => {
const baseBlob = new Blob(["base"]);
const overlayBlob = new Blob(["overlay"]);
const baseBitmap = { width: 100, height: 100 } as ImageBitmap;
const overlayBitmap = { width: 200, height: 100 } as ImageBitmap;
const composedBitmap = { width: 100, height: 100 } as ImageBitmap;
const drawImage = vi.fn();
const context = {
clearRect: vi.fn(),
drawImage,
save: vi.fn(),
restore: vi.fn(),
beginPath: vi.fn(),
rect: vi.fn(),
clip: vi.fn(),
globalCompositeOperation: "source-over" as GlobalCompositeOperation,
globalAlpha: 1,
};
const canvas = {
width: 0,
height: 0,
getContext: vi.fn().mockReturnValue(context),
} as unknown as HTMLCanvasElement;
const nativeCreateElement = document.createElement.bind(document);
vi.spyOn(document, "createElement").mockImplementation((tagName: string) => {
if (tagName.toLowerCase() === "canvas") {
return canvas;
}
return nativeCreateElement(tagName);
});
vi.stubGlobal(
"fetch",
vi.fn().mockImplementation(async (input: string | URL | Request) => {
const url = String(input);
if (url.includes("base.png")) {
return {
ok: true,
status: 200,
headers: { get: vi.fn().mockReturnValue("image/png") },
blob: vi.fn().mockResolvedValue(baseBlob),
};
}
return {
ok: true,
status: 200,
headers: { get: vi.fn().mockReturnValue("image/png") },
blob: vi.fn().mockResolvedValue(overlayBlob),
};
}),
);
vi.stubGlobal(
"createImageBitmap",
vi.fn().mockImplementation(async (input: unknown) => {
if (input === baseBlob) {
return baseBitmap;
}
if (input === overlayBlob) {
return overlayBitmap;
}
if (input === canvas) {
return composedBitmap;
}
throw new Error("Unexpected createImageBitmap input in mixer contain-fit test.");
}),
);
const { loadRenderSourceBitmap } = await importSubject();
await expect(
loadRenderSourceBitmap({
sourceComposition: {
kind: "mixer",
baseUrl: "https://cdn.example.com/base.png",
overlayUrl: "https://cdn.example.com/overlay.png",
blendMode: "overlay",
opacity: 80,
overlayX: 0.1,
overlayY: 0.2,
overlayWidth: 0.25,
overlayHeight: 0.5,
cropLeft: 0,
cropTop: 0,
cropRight: 0,
cropBottom: 0,
},
}),
).resolves.toBe(composedBitmap);
expect(drawImage).toHaveBeenNthCalledWith(1, baseBitmap, 0, 0, 100, 100);
const overlayDrawArgs = drawImage.mock.calls[1];
expect(overlayDrawArgs?.[0]).toBe(overlayBitmap);
expect(overlayDrawArgs?.[1]).toBe(0);
expect(overlayDrawArgs?.[2]).toBe(0);
expect(overlayDrawArgs?.[3]).toBe(200);
expect(overlayDrawArgs?.[4]).toBe(100);
expect(overlayDrawArgs?.[5]).toBe(10);
expect(overlayDrawArgs?.[6]).toBeCloseTo(38.75, 10);
expect(overlayDrawArgs?.[7]).toBe(25);
expect(overlayDrawArgs?.[8]).toBeCloseTo(12.5, 10);
});
it("applies mixer crop framing by trimming source edges while leaving the displayed frame size untouched", async () => {
const baseBlob = new Blob(["base"]);
const overlayBlob = new Blob(["overlay"]);
const baseBitmap = { width: 100, height: 100 } as ImageBitmap;
const overlayBitmap = { width: 200, height: 100 } as ImageBitmap;
const composedBitmap = { width: 100, height: 100 } as ImageBitmap;
const drawImage = vi.fn();
const save = vi.fn();
const restore = vi.fn();
const beginPath = vi.fn();
const rect = vi.fn();
const clip = vi.fn();
const context = {
clearRect: vi.fn(),
drawImage,
save,
restore,
beginPath,
rect,
clip,
globalCompositeOperation: "source-over" as GlobalCompositeOperation,
globalAlpha: 1,
};
const canvas = {
width: 0,
height: 0,
getContext: vi.fn().mockReturnValue(context),
} as unknown as HTMLCanvasElement;
const nativeCreateElement = document.createElement.bind(document);
vi.spyOn(document, "createElement").mockImplementation((tagName: string) => {
if (tagName.toLowerCase() === "canvas") {
return canvas;
}
return nativeCreateElement(tagName);
});
vi.stubGlobal(
"fetch",
vi.fn().mockImplementation(async (input: string | URL | Request) => {
const url = String(input);
if (url.includes("base.png")) {
return {
ok: true,
status: 200,
headers: { get: vi.fn().mockReturnValue("image/png") },
blob: vi.fn().mockResolvedValue(baseBlob),
};
}
return {
ok: true,
status: 200,
headers: { get: vi.fn().mockReturnValue("image/png") },
blob: vi.fn().mockResolvedValue(overlayBlob),
};
}),
);
vi.stubGlobal(
"createImageBitmap",
vi.fn().mockImplementation(async (input: unknown) => {
if (input === baseBlob) {
return baseBitmap;
}
if (input === overlayBlob) {
return overlayBitmap;
}
if (input === canvas) {
return composedBitmap;
}
throw new Error("Unexpected createImageBitmap input in mixer content framing test.");
}),
);
const { loadRenderSourceBitmap } = await importSubject();
await expect(
loadRenderSourceBitmap({
sourceComposition: {
kind: "mixer",
baseUrl: "https://cdn.example.com/base.png",
overlayUrl: "https://cdn.example.com/overlay.png",
blendMode: "overlay",
opacity: 80,
overlayX: 0.1,
overlayY: 0.2,
overlayWidth: 0.4,
overlayHeight: 0.4,
cropLeft: 0.5,
cropTop: 0,
cropRight: 0,
cropBottom: 0,
},
}),
).resolves.toBe(composedBitmap);
expect(drawImage).toHaveBeenNthCalledWith(1, baseBitmap, 0, 0, 100, 100);
expect(save).toHaveBeenCalledTimes(1);
expect(beginPath).toHaveBeenCalledTimes(1);
expect(rect).toHaveBeenCalledWith(10, 20, 40, 40);
expect(clip).toHaveBeenCalledTimes(1);
expect(drawImage).toHaveBeenNthCalledWith(
2,
overlayBitmap,
100,
0,
100,
100,
10,
20,
40,
40,
);
expect(restore).toHaveBeenCalledTimes(1);
});
it("keeps overlayWidth and overlayHeight fixed while crop framing trims the sampled source region", async () => {
const baseBlob = new Blob(["base"]);
const overlayBlob = new Blob(["overlay"]);
const baseBitmap = { width: 100, height: 100 } as ImageBitmap;
const overlayBitmap = { width: 200, height: 100 } as ImageBitmap;
const composedBitmap = { width: 100, height: 100 } as ImageBitmap;
const drawImage = vi.fn();
const context = {
clearRect: vi.fn(),
drawImage,
save: vi.fn(),
restore: vi.fn(),
beginPath: vi.fn(),
rect: vi.fn(),
clip: vi.fn(),
globalCompositeOperation: "source-over" as GlobalCompositeOperation,
globalAlpha: 1,
};
const canvas = {
width: 0,
height: 0,
getContext: vi.fn().mockReturnValue(context),
} as unknown as HTMLCanvasElement;
const nativeCreateElement = document.createElement.bind(document);
vi.spyOn(document, "createElement").mockImplementation((tagName: string) => {
if (tagName.toLowerCase() === "canvas") {
return canvas;
}
return nativeCreateElement(tagName);
});
vi.stubGlobal(
"fetch",
vi.fn().mockImplementation(async (input: string | URL | Request) => {
const url = String(input);
if (url.includes("base.png")) {
return {
ok: true,
status: 200,
headers: { get: vi.fn().mockReturnValue("image/png") },
blob: vi.fn().mockResolvedValue(baseBlob),
};
}
return {
ok: true,
status: 200,
headers: { get: vi.fn().mockReturnValue("image/png") },
blob: vi.fn().mockResolvedValue(overlayBlob),
};
}),
);
vi.stubGlobal(
"createImageBitmap",
vi.fn().mockImplementation(async (input: unknown) => {
if (input === baseBlob) {
return baseBitmap;
}
if (input === overlayBlob) {
return overlayBitmap;
}
if (input === canvas) {
return composedBitmap;
}
throw new Error("Unexpected createImageBitmap input in overlay size preservation test.");
}),
);
const { loadRenderSourceBitmap } = await importSubject();
await expect(
loadRenderSourceBitmap({
sourceComposition: {
kind: "mixer",
baseUrl: "https://cdn.example.com/base.png",
overlayUrl: "https://cdn.example.com/overlay.png",
blendMode: "overlay",
opacity: 80,
overlayX: 0.15,
overlayY: 0.25,
overlayWidth: 0.5,
overlayHeight: 0.3,
cropLeft: 0.25,
cropTop: 0.1,
cropRight: 0.25,
cropBottom: 0.3,
},
}),
).resolves.toBe(composedBitmap);
const overlayDrawArgs = drawImage.mock.calls[1];
expect(overlayDrawArgs?.[0]).toBe(overlayBitmap);
expect(overlayDrawArgs?.[1]).toBe(50);
expect(overlayDrawArgs?.[2]).toBe(10);
expect(overlayDrawArgs?.[3]).toBe(100);
expect(overlayDrawArgs?.[4]).toBeCloseTo(60, 10);
expect(overlayDrawArgs?.[5]).toBeCloseTo(15, 10);
expect(overlayDrawArgs?.[6]).toBeCloseTo(25, 10);
expect(overlayDrawArgs?.[7]).toBeCloseTo(50, 10);
expect(overlayDrawArgs?.[8]).toBeCloseTo(30, 10);
});
it("contains a cropped wide source within the overlay frame during bake", async () => {
const baseBlob = new Blob(["base"]);
const overlayBlob = new Blob(["overlay"]);
const baseBitmap = { width: 100, height: 100 } as ImageBitmap;
const overlayBitmap = { width: 200, height: 100 } as ImageBitmap;
const composedBitmap = { width: 100, height: 100 } as ImageBitmap;
const drawImage = vi.fn();
const context = {
clearRect: vi.fn(),
drawImage,
save: vi.fn(),
restore: vi.fn(),
beginPath: vi.fn(),
rect: vi.fn(),
clip: vi.fn(),
globalCompositeOperation: "source-over" as GlobalCompositeOperation,
globalAlpha: 1,
};
const canvas = {
width: 0,
height: 0,
getContext: vi.fn().mockReturnValue(context),
} as unknown as HTMLCanvasElement;
const nativeCreateElement = document.createElement.bind(document);
vi.spyOn(document, "createElement").mockImplementation((tagName: string) => {
if (tagName.toLowerCase() === "canvas") {
return canvas;
}
return nativeCreateElement(tagName);
});
vi.stubGlobal(
"fetch",
vi.fn().mockImplementation(async (input: string | URL | Request) => {
const url = String(input);
if (url.includes("base.png")) {
return {
ok: true,
status: 200,
headers: { get: vi.fn().mockReturnValue("image/png") },
blob: vi.fn().mockResolvedValue(baseBlob),
};
}
return {
ok: true,
status: 200,
headers: { get: vi.fn().mockReturnValue("image/png") },
blob: vi.fn().mockResolvedValue(overlayBlob),
};
}),
);
vi.stubGlobal(
"createImageBitmap",
vi.fn().mockImplementation(async (input: unknown) => {
if (input === baseBlob) {
return baseBitmap;
}
if (input === overlayBlob) {
return overlayBitmap;
}
if (input === canvas) {
return composedBitmap;
}
throw new Error("Unexpected createImageBitmap input in aspect-aware crop bake test.");
}),
);
const { loadRenderSourceBitmap } = await importSubject();
await expect(
loadRenderSourceBitmap({
sourceComposition: {
kind: "mixer",
baseUrl: "https://cdn.example.com/base.png",
overlayUrl: "https://cdn.example.com/overlay.png",
blendMode: "overlay",
opacity: 80,
overlayX: 0.1,
overlayY: 0.2,
overlayWidth: 0.4,
overlayHeight: 0.4,
cropLeft: 0,
cropTop: 0.25,
cropRight: 0,
cropBottom: 0.25,
},
}),
).resolves.toBe(composedBitmap);
const overlayDrawArgs = drawImage.mock.calls[1];
expect(overlayDrawArgs?.[0]).toBe(overlayBitmap);
expect(overlayDrawArgs?.[1]).toBe(0);
expect(overlayDrawArgs?.[2]).toBe(25);
expect(overlayDrawArgs?.[3]).toBe(200);
expect(overlayDrawArgs?.[4]).toBe(50);
expect(overlayDrawArgs?.[5]).toBe(10);
expect(overlayDrawArgs?.[6]).toBeCloseTo(35, 10);
expect(overlayDrawArgs?.[7]).toBe(40);
expect(overlayDrawArgs?.[8]).toBeCloseTo(10, 10);
});
});
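The source-rect expectations in the crop tests above follow from simple proportional trims of the overlay bitmap. A standalone sketch of that arithmetic (my own helper, mirroring what the tests assert rather than the production source-loader):

```typescript
// Crop trims (normalized 0..1 per edge) select the sampled sub-rect of the
// overlay source; the destination frame on the base stays untouched.
function cropSourceRect(
  sourceWidth: number,
  sourceHeight: number,
  cropLeft: number,
  cropTop: number,
  cropRight: number,
  cropBottom: number,
): { sx: number; sy: number; sw: number; sh: number } {
  return {
    sx: cropLeft * sourceWidth,
    sy: cropTop * sourceHeight,
    sw: (1 - cropLeft - cropRight) * sourceWidth,
    sh: (1 - cropTop - cropBottom) * sourceHeight,
  };
}

// For the 200x100 overlay with cropLeft 0.25, cropTop 0.1, cropRight 0.25,
// cropBottom 0.3 this yields sx 50, sy 10, sw 100, sh 60 (up to float noise):
// the first four drawImage source arguments the third test above asserts.
```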


@@ -341,6 +341,7 @@ describe("webgl backend poc", () => {
vi.doMock("@/lib/image-pipeline/source-loader", () => ({
loadSourceBitmap: vi.fn().mockResolvedValue({ width: 2, height: 2 }),
loadRenderSourceBitmap: vi.fn().mockResolvedValue({ width: 2, height: 2 }),
}));
vi.spyOn(HTMLCanvasElement.prototype, "getContext").mockReturnValue({


@@ -4,7 +4,7 @@ import { buildGraphSnapshot } from "@/lib/canvas-render-preview";
import { resolveMixerPreviewFromGraph } from "@/lib/canvas-mixer-preview";
describe("resolveMixerPreviewFromGraph", () => {
-it("resolves base and overlay URLs by target handle", () => {
+it("resolves base and overlay URLs by target handle while keeping frame and crop trims independent", () => {
const graph = buildGraphSnapshot(
[
{
@@ -25,7 +25,18 @@ describe("resolveMixerPreviewFromGraph", () => {
{
id: "mixer-1",
type: "mixer",
-data: { blendMode: "screen", opacity: 70, offsetX: 12, offsetY: -8 },
+data: {
blendMode: "screen",
opacity: 70,
overlayX: 0.12,
overlayY: 0.2,
overlayWidth: 0.6,
overlayHeight: 0.5,
cropLeft: 0.08,
cropTop: 0.15,
cropRight: 0.22,
cropBottom: 0.1,
},
},
},
],
[
@@ -41,12 +52,114 @@ describe("resolveMixerPreviewFromGraph", () => {
overlayUrl: "https://cdn.example.com/overlay.png",
blendMode: "screen",
opacity: 70,
-offsetX: 12,
+overlayX: 0.12,
-offsetY: -8,
+overlayY: 0.2,
overlayWidth: 0.6,
overlayHeight: 0.5,
cropLeft: 0.08,
cropTop: 0.15,
cropRight: 0.22,
cropBottom: 0.1,
});
});
-it("prefers render output URL over upstream preview source when available", () => {
+it("preserves crop trims when frame resize data changes", () => {
const graph = buildGraphSnapshot(
[
{
id: "image-base",
type: "image",
data: { url: "https://cdn.example.com/base.png" },
},
{
id: "overlay-asset",
type: "asset",
data: { url: "https://cdn.example.com/overlay.png" },
},
{
id: "mixer-1",
type: "mixer",
data: {
overlayX: 0.2,
overlayY: 0.1,
overlayWidth: 0.6,
overlayHeight: 0.3,
cropLeft: 0.15,
cropTop: 0.05,
cropRight: 0.4,
cropBottom: 0.25,
},
},
],
[
{ source: "image-base", target: "mixer-1", targetHandle: "base" },
{ source: "overlay-asset", target: "mixer-1", targetHandle: "overlay" },
],
);
expect(resolveMixerPreviewFromGraph({ nodeId: "mixer-1", graph })).toEqual(
expect.objectContaining({
overlayX: 0.2,
overlayY: 0.1,
overlayWidth: 0.6,
overlayHeight: 0.3,
cropLeft: 0.15,
cropTop: 0.05,
cropRight: 0.4,
cropBottom: 0.25,
}),
);
});
it("preserves overlayWidth and overlayHeight when crop trims change", () => {
const graph = buildGraphSnapshot(
[
{
id: "image-base",
type: "image",
data: { url: "https://cdn.example.com/base.png" },
},
{
id: "overlay-asset",
type: "asset",
data: { url: "https://cdn.example.com/overlay.png" },
},
{
id: "mixer-1",
type: "mixer",
data: {
overlayX: 0.05,
overlayY: 0.25,
overlayWidth: 0.55,
overlayHeight: 0.35,
cropLeft: 0.4,
cropTop: 0.1,
cropRight: 0.3,
cropBottom: 0.1,
},
},
],
[
{ source: "image-base", target: "mixer-1", targetHandle: "base" },
{ source: "overlay-asset", target: "mixer-1", targetHandle: "overlay" },
],
);
expect(resolveMixerPreviewFromGraph({ nodeId: "mixer-1", graph })).toEqual(
expect.objectContaining({
overlayX: 0.05,
overlayY: 0.25,
overlayWidth: 0.55,
overlayHeight: 0.35,
cropLeft: 0.4,
cropTop: 0.1,
cropRight: 0.3,
cropBottom: 0.1,
}),
);
});
it("prefers live render preview URL over stale baked render output", () => {
const graph = buildGraphSnapshot(
[
{
@@ -82,11 +195,79 @@ describe("resolveMixerPreviewFromGraph", () => {
expect(resolveMixerPreviewFromGraph({ nodeId: "mixer-1", graph })).toEqual({
status: "ready",
baseUrl: "https://cdn.example.com/base.png",
-overlayUrl: "https://cdn.example.com/render-output.png",
+overlayUrl: "https://cdn.example.com/upstream.png",
blendMode: "normal",
opacity: 100,
-offsetX: 0,
+overlayX: 0,
-offsetY: 0,
+overlayY: 0,
overlayWidth: 1,
overlayHeight: 1,
cropLeft: 0,
cropTop: 0,
cropRight: 0,
cropBottom: 0,
});
});
it("does not reuse stale baked render output when only live sourceComposition exists", () => {
const graph = buildGraphSnapshot(
[
{
id: "base-image",
type: "image",
data: { url: "https://cdn.example.com/base.png" },
},
{
id: "overlay-base",
type: "image",
data: { url: "https://cdn.example.com/overlay-base.png" },
},
{
id: "overlay-asset",
type: "asset",
data: { url: "https://cdn.example.com/overlay-asset.png" },
},
{
id: "upstream-mixer",
type: "mixer",
data: {},
},
{
id: "render-overlay",
type: "render",
data: {
lastUploadUrl: "https://cdn.example.com/stale-render-output.png",
},
},
{
id: "mixer-1",
type: "mixer",
data: {},
},
],
[
{ source: "overlay-base", target: "upstream-mixer", targetHandle: "base" },
{ source: "overlay-asset", target: "upstream-mixer", targetHandle: "overlay" },
{ source: "upstream-mixer", target: "render-overlay" },
{ source: "base-image", target: "mixer-1", targetHandle: "base" },
{ source: "render-overlay", target: "mixer-1", targetHandle: "overlay" },
],
);
expect(resolveMixerPreviewFromGraph({ nodeId: "mixer-1", graph })).toEqual({
status: "partial",
baseUrl: "https://cdn.example.com/base.png",
overlayUrl: undefined,
blendMode: "normal",
opacity: 100,
overlayX: 0,
overlayY: 0,
overlayWidth: 1,
overlayHeight: 1,
cropLeft: 0,
cropTop: 0,
cropRight: 0,
cropBottom: 0,
});
});
@@ -113,12 +294,18 @@ describe("resolveMixerPreviewFromGraph", () => {
overlayUrl: undefined,
blendMode: "normal",
opacity: 100,
-offsetX: 0,
+overlayX: 0,
-offsetY: 0,
+overlayY: 0,
overlayWidth: 1,
overlayHeight: 1,
cropLeft: 0,
cropTop: 0,
cropRight: 0,
cropBottom: 0,
});
});
-it("normalizes blend mode and clamps numeric values", () => {
+it("normalizes crop trims and clamps", () => {
const graph = buildGraphSnapshot(
[
{
@@ -137,8 +324,14 @@ describe("resolveMixerPreviewFromGraph", () => {
data: {
blendMode: "unknown",
opacity: 180,
-offsetX: 9999,
+overlayX: -3,
-offsetY: "-9999",
+overlayY: "1.4",
overlayWidth: 2,
overlayHeight: 0,
cropLeft: "0.95",
cropTop: -2,
cropRight: "4",
cropBottom: "0",
},
},
],
@@ -154,8 +347,151 @@ describe("resolveMixerPreviewFromGraph", () => {
overlayUrl: "https://cdn.example.com/overlay-asset.png",
blendMode: "normal",
opacity: 100,
-offsetX: 2048,
+overlayX: 0,
-offsetY: -2048,
+overlayY: 0.9,
overlayWidth: 1,
overlayHeight: 0.1,
cropLeft: 0.9,
cropTop: 0,
cropRight: 0,
cropBottom: 0,
});
});
it("missing rect fields fallback to sensible defaults", () => {
const graph = buildGraphSnapshot(
[
{
id: "base-ai",
type: "ai-image",
data: { url: "https://cdn.example.com/base-ai.png" },
},
{
id: "overlay-asset",
type: "asset",
data: { url: "https://cdn.example.com/overlay-asset.png" },
},
{
id: "mixer-1",
type: "mixer",
data: {
blendMode: "multiply",
opacity: 42,
},
},
],
[
{ source: "base-ai", target: "mixer-1", targetHandle: "base" },
{ source: "overlay-asset", target: "mixer-1", targetHandle: "overlay" },
],
);
expect(resolveMixerPreviewFromGraph({ nodeId: "mixer-1", graph })).toEqual({
status: "ready",
baseUrl: "https://cdn.example.com/base-ai.png",
overlayUrl: "https://cdn.example.com/overlay-asset.png",
blendMode: "multiply",
opacity: 42,
overlayX: 0,
overlayY: 0,
overlayWidth: 1,
overlayHeight: 1,
cropLeft: 0,
cropTop: 0,
cropRight: 0,
cropBottom: 0,
});
});
it("maps legacy content rect fields into crop trims during normalization", () => {
const graph = buildGraphSnapshot(
[
{
id: "base-ai",
type: "ai-image",
data: { url: "https://cdn.example.com/base-ai.png" },
},
{
id: "overlay-asset",
type: "asset",
data: { url: "https://cdn.example.com/overlay-asset.png" },
},
{
id: "mixer-1",
type: "mixer",
data: {
contentX: 0.2,
contentY: 0.1,
contentWidth: 0.5,
contentHeight: 0.6,
},
},
],
[
{ source: "base-ai", target: "mixer-1", targetHandle: "base" },
{ source: "overlay-asset", target: "mixer-1", targetHandle: "overlay" },
],
);
expect(resolveMixerPreviewFromGraph({ nodeId: "mixer-1", graph })).toEqual({
status: "ready",
baseUrl: "https://cdn.example.com/base-ai.png",
overlayUrl: "https://cdn.example.com/overlay-asset.png",
blendMode: "normal",
opacity: 100,
overlayX: 0,
overlayY: 0,
overlayWidth: 1,
overlayHeight: 1,
cropLeft: 0.2,
cropTop: 0.1,
cropRight: 0.30000000000000004,
cropBottom: 0.30000000000000004,
});
});
it("legacy offset fields still yield visible overlay geometry", () => {
const graph = buildGraphSnapshot(
[
{
id: "base-ai",
type: "ai-image",
data: { url: "https://cdn.example.com/base-ai.png" },
},
{
id: "overlay-asset",
type: "asset",
data: { url: "https://cdn.example.com/overlay-asset.png" },
},
{
id: "mixer-1",
type: "mixer",
data: {
offsetX: 100,
offsetY: -40,
},
},
],
[
{ source: "base-ai", target: "mixer-1", targetHandle: "base" },
{ source: "overlay-asset", target: "mixer-1", targetHandle: "overlay" },
],
);
expect(resolveMixerPreviewFromGraph({ nodeId: "mixer-1", graph })).toEqual({
status: "ready",
baseUrl: "https://cdn.example.com/base-ai.png",
overlayUrl: "https://cdn.example.com/overlay-asset.png",
blendMode: "normal",
opacity: 100,
overlayX: 0,
overlayY: 0,
overlayWidth: 1,
overlayHeight: 1,
cropLeft: 0,
cropTop: 0,
cropRight: 0,
cropBottom: 0,
});
});
@@ -190,8 +526,14 @@ describe("resolveMixerPreviewFromGraph", () => {
overlayUrl: undefined,
blendMode: "normal",
opacity: 100,
-offsetX: 0,
+overlayX: 0,
-offsetY: 0,
+overlayY: 0,
overlayWidth: 1,
overlayHeight: 1,
cropLeft: 0,
cropTop: 0,
cropRight: 0,
cropBottom: 0,
error: "duplicate-handle-edge",
});
});
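The legacy-migration test above expects `contentX/contentY/contentWidth/contentHeight` to be folded into per-edge crop trims. A standalone sketch of that mapping (it mirrors what the test asserts; it is not the production normalizer):

```typescript
// Legacy content rect (origin + size, normalized 0..1) → per-edge crop trims.
// The right/bottom trims are whatever the content rect leaves uncovered.
function legacyContentRectToCropTrims(rect: {
  contentX: number;
  contentY: number;
  contentWidth: number;
  contentHeight: number;
}): { cropLeft: number; cropTop: number; cropRight: number; cropBottom: number } {
  return {
    cropLeft: rect.contentX,
    cropTop: rect.contentY,
    cropRight: 1 - rect.contentX - rect.contentWidth,
    cropBottom: 1 - rect.contentY - rect.contentHeight,
  };
}

// { contentX: 0.2, contentY: 0.1, contentWidth: 0.5, contentHeight: 0.6 }
// gives cropLeft 0.2, cropTop 0.1, and right/bottom trims of 0.3 plus the
// IEEE-754 residue (0.30000000000000004) the test above pins verbatim.
```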


@@ -4,8 +4,147 @@ import {
buildGraphSnapshot,
resolveRenderPreviewInputFromGraph,
} from "@/lib/canvas-render-preview";
import {
computeMixerCompareOverlayImageStyle,
computeMixerFrameRectInSurface,
computeVisibleMixerContentRect,
computeMixerCropImageStyle,
isMixerCropImageReady,
} from "@/lib/mixer-crop-layout";
describe("resolveRenderPreviewInputFromGraph", () => {
it("resolves mixer input as renderable mixer composition", () => {
const graph = buildGraphSnapshot(
[
{
id: "base-image",
type: "image",
data: { url: "https://cdn.example.com/base.png" },
},
{
id: "overlay-image",
type: "asset",
data: { url: "https://cdn.example.com/overlay.png" },
},
{
id: "mixer-1",
type: "mixer",
data: {
blendMode: "overlay",
opacity: 76,
overlayX: 0.2,
overlayY: 0.1,
overlayWidth: 0.55,
overlayHeight: 0.44,
cropLeft: 0.08,
cropTop: 0.15,
cropRight: 0.22,
cropBottom: 0.1,
},
},
{
id: "render-1",
type: "render",
data: {},
},
],
[
{ source: "base-image", target: "mixer-1", targetHandle: "base" },
{ source: "overlay-image", target: "mixer-1", targetHandle: "overlay" },
{ source: "mixer-1", target: "render-1" },
],
);
const preview = resolveRenderPreviewInputFromGraph({
nodeId: "render-1",
graph,
});
expect(preview).toEqual({
sourceUrl: null,
sourceComposition: {
kind: "mixer",
baseUrl: "https://cdn.example.com/base.png",
overlayUrl: "https://cdn.example.com/overlay.png",
blendMode: "overlay",
opacity: 76,
overlayX: 0.2,
overlayY: 0.1,
overlayWidth: 0.55,
overlayHeight: 0.44,
cropLeft: 0.08,
cropTop: 0.15,
cropRight: 0.22,
cropBottom: 0.1,
},
steps: [],
});
});
it("normalizes mixer composition values for render input", () => {
const graph = buildGraphSnapshot(
[
{
id: "base-image",
type: "image",
data: { url: "https://cdn.example.com/base.png" },
},
{
id: "overlay-image",
type: "asset",
data: { url: "https://cdn.example.com/overlay.png" },
},
{
id: "mixer-1",
type: "mixer",
data: {
blendMode: "unknown",
opacity: 180,
overlayX: -3,
overlayY: "1.4",
overlayWidth: 2,
overlayHeight: 0,
cropLeft: "0.95",
cropTop: -2,
cropRight: "4",
cropBottom: "0",
},
},
{
id: "render-1",
type: "render",
data: {},
},
],
[
{ source: "base-image", target: "mixer-1", targetHandle: "base" },
{ source: "overlay-image", target: "mixer-1", targetHandle: "overlay" },
{ source: "mixer-1", target: "render-1" },
],
);
const preview = resolveRenderPreviewInputFromGraph({
nodeId: "render-1",
graph,
});
expect(preview.sourceComposition).toEqual({
kind: "mixer",
baseUrl: "https://cdn.example.com/base.png",
overlayUrl: "https://cdn.example.com/overlay.png",
blendMode: "normal",
opacity: 100,
overlayX: 0,
overlayY: 0.9,
overlayWidth: 1,
overlayHeight: 0.1,
cropLeft: 0.9,
cropTop: 0,
cropRight: 0,
cropBottom: 0,
});
});
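One plausible reading of the normalization the expectations above pin down: blend mode falls back to `normal`, opacity clamps to 0..100, the frame keeps a minimum size of 0.1 inside [0, 1], and crops leave at least 10% of each axis visible. A hedged sketch (helper names are hypothetical, not taken from the repo):

```typescript
// Sketch of the clamping rules implied by the test's input -> output pairs.
const BLEND_MODES = ["normal", "multiply", "screen", "overlay"] as const;
type BlendMode = (typeof BLEND_MODES)[number];

const toNumber = (value: unknown, fallback: number): number => {
  const parsed = typeof value === "string" ? Number(value) : value;
  return typeof parsed === "number" && Number.isFinite(parsed) ? parsed : fallback;
};

const clamp = (value: number, min: number, max: number): number =>
  Math.min(Math.max(value, min), max);

function normalizeMixerData(data: Record<string, unknown>) {
  const blendMode: BlendMode = BLEND_MODES.includes(data.blendMode as BlendMode)
    ? (data.blendMode as BlendMode)
    : "normal";
  const opacity = clamp(toNumber(data.opacity, 100), 0, 100);
  // Frame rect: at least 0.1 per dimension, and the frame stays inside [0, 1].
  const overlayWidth = clamp(toNumber(data.overlayWidth, 1), 0.1, 1);
  const overlayHeight = clamp(toNumber(data.overlayHeight, 1), 0.1, 1);
  const overlayX = clamp(toNumber(data.overlayX, 0), 0, 1 - overlayWidth);
  const overlayY = clamp(toNumber(data.overlayY, 0), 0, 1 - overlayHeight);
  // Crop: clamp each edge, then cap the opposite edge so >= 10% stays visible.
  const cropLeft = clamp(toNumber(data.cropLeft, 0), 0, 0.9);
  const cropRight = clamp(toNumber(data.cropRight, 0), 0, 0.9 - cropLeft);
  const cropTop = clamp(toNumber(data.cropTop, 0), 0, 0.9);
  const cropBottom = clamp(toNumber(data.cropBottom, 0), 0, 0.9 - cropTop);
  return {
    blendMode, opacity, overlayX, overlayY, overlayWidth, overlayHeight,
    cropLeft, cropTop, cropRight, cropBottom,
  };
}
```

Under these assumed rules, `overlayY: "1.4"` with `overlayHeight: 0` yields `{ overlayY: 0.9, overlayHeight: 0.1 }`, matching the assertion above.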
it("includes crop in collected pipeline steps", () => {
const graph = buildGraphSnapshot(
[
@@ -88,5 +227,191 @@ describe("resolveRenderPreviewInputFromGraph", () => {
const preview = resolveRenderPreviewInputFromGraph({ nodeId: "render-1", graph });
expect(preview.sourceUrl).toBe("https://cdn.example.com/generated-video.mp4");
expect(preview.sourceComposition).toBeUndefined();
});
it("prefers live render preview URLs over stale baked render URLs inside downstream mixer compositions", () => {
const graph = buildGraphSnapshot(
[
{
id: "base-image",
type: "image",
data: { url: "https://cdn.example.com/base.png" },
},
{
id: "overlay-upstream",
type: "image",
data: { url: "https://cdn.example.com/upstream.png" },
},
{
id: "render-overlay",
type: "render",
data: {
lastUploadUrl: "https://cdn.example.com/stale-render-output.png",
},
},
{
id: "mixer-1",
type: "mixer",
data: {},
},
{
id: "render-2",
type: "render",
data: {},
},
],
[
{ source: "overlay-upstream", target: "render-overlay" },
{ source: "base-image", target: "mixer-1", targetHandle: "base" },
{ source: "render-overlay", target: "mixer-1", targetHandle: "overlay" },
{ source: "mixer-1", target: "render-2" },
],
);
const preview = resolveRenderPreviewInputFromGraph({ nodeId: "render-2", graph });
expect(preview).toEqual({
sourceUrl: null,
sourceComposition: {
kind: "mixer",
baseUrl: "https://cdn.example.com/base.png",
overlayUrl: "https://cdn.example.com/upstream.png",
blendMode: "normal",
opacity: 100,
overlayX: 0,
overlayY: 0,
overlayWidth: 1,
overlayHeight: 1,
cropLeft: 0,
cropTop: 0,
cropRight: 0,
cropBottom: 0,
},
steps: [],
});
});
});
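The stale-render test above encodes a resolution preference: when a mixer input is itself a render node, follow the graph upstream to the live source URL instead of trusting the render node's possibly stale `lastUploadUrl`. A minimal sketch of that traversal (types and function names are hypothetical, not from `canvas-render-preview`):

```typescript
// Prefer the live upstream URL over a render node's baked output.
interface GraphNode { id: string; type: string; data: Record<string, unknown>; }
interface GraphEdge { source: string; target: string; targetHandle?: string; }

function resolveLiveUrl(
  nodeId: string,
  nodes: GraphNode[],
  edges: GraphEdge[],
): string | undefined {
  const node = nodes.find((n) => n.id === nodeId);
  if (!node) return undefined;
  if (node.type === "render") {
    // A connected upstream source wins over the render node's cached upload.
    const incoming = edges.find((e) => e.target === nodeId);
    if (incoming) return resolveLiveUrl(incoming.source, nodes, edges);
    return node.data.lastUploadUrl as string | undefined; // fallback: baked output
  }
  return node.data.url as string | undefined;
}
```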
describe("mixer crop layout parity", () => {
it("contains a wide cropped source inside a square overlay frame", () => {
expect(
computeVisibleMixerContentRect({
frameAspectRatio: 1,
sourceWidth: 200,
sourceHeight: 100,
cropLeft: 0,
cropTop: 0.25,
cropRight: 0,
cropBottom: 0.25,
}),
).toEqual({
x: 0,
y: 0.375,
width: 1,
height: 0.25,
});
});
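The expectation above can be re-derived as contain-fit plus crop: letterbox the full source into the frame without stretching, then shrink the contained rect by the normalized crop fractions. A hedged sketch (local names, not the `mixer-crop-layout` API):

```typescript
// Contain-fit a source into a frame, then apply normalized edge crops.
interface VisibleRectInput {
  frameAspectRatio: number; // frame width / height
  sourceWidth: number;
  sourceHeight: number;
  cropLeft: number; // fractions of the source, normalized 0..1
  cropTop: number;
  cropRight: number;
  cropBottom: number;
}

function visibleContentRect(input: VisibleRectInput) {
  const sourceAspect = input.sourceWidth / input.sourceHeight;
  // Contain-fit: the full source letterboxes or pillarboxes inside the frame.
  let width = 1;
  let height = 1;
  if (sourceAspect > input.frameAspectRatio) {
    height = input.frameAspectRatio / sourceAspect; // wide source: bars top/bottom
  } else {
    width = sourceAspect / input.frameAspectRatio; // tall source: bars left/right
  }
  const x0 = (1 - width) / 2;
  const y0 = (1 - height) / 2;
  // Shrink the contained rect by the crop fractions.
  return {
    x: x0 + input.cropLeft * width,
    y: y0 + input.cropTop * height,
    width: width * (1 - input.cropLeft - input.cropRight),
    height: height * (1 - input.cropTop - input.cropBottom),
  };
}
```

For the wide 200×100 source in a square frame, the contained rect is y 0.25..0.75; cropping 25% top and bottom leaves y 0.375 with height 0.25, as asserted above.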
it("returns compare image styles that letterbox instead of stretching", () => {
expect(
computeMixerCropImageStyle({
frameAspectRatio: 1,
sourceWidth: 200,
sourceHeight: 100,
cropLeft: 0,
cropTop: 0,
cropRight: 0,
cropBottom: 0,
}),
).toEqual({
left: "0%",
top: "25%",
width: "100%",
height: "50%",
});
});
it("uses the actual base-aware frame pixel ratio for compare crop math", () => {
expect(
computeMixerCompareOverlayImageStyle({
surfaceWidth: 500,
surfaceHeight: 380,
baseWidth: 200,
baseHeight: 100,
overlayX: 0.1,
overlayY: 0.2,
overlayWidth: 0.4,
overlayHeight: 0.4,
sourceWidth: 200,
sourceHeight: 100,
cropLeft: 0.1,
cropTop: 0,
cropRight: 0.1,
cropBottom: 0,
}),
).toEqual({
left: "0%",
top: "0%",
width: "100%",
height: "100%",
});
});
it("does not mark compare crop overlay ready before natural size is known", () => {
expect(
isMixerCropImageReady({
currentOverlayUrl: "https://cdn.example.com/overlay-a.png",
loadedOverlayUrl: null,
sourceWidth: 0,
sourceHeight: 0,
}),
).toBe(false);
});
it("invalidates compare crop overlay readiness on source swap until the new image loads", () => {
expect(
isMixerCropImageReady({
currentOverlayUrl: "https://cdn.example.com/overlay-b.png",
loadedOverlayUrl: "https://cdn.example.com/overlay-a.png",
sourceWidth: 200,
sourceHeight: 100,
}),
).toBe(false);
});
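The two readiness tests above reduce to a small predicate: the crop overlay is only ready once the currently requested overlay URL is the one that actually finished loading and the natural size is known. A sketch of that check (the predicate name and shape are assumptions):

```typescript
// Readiness guard for the compare crop overlay image.
function isCropImageReady(input: {
  currentOverlayUrl: string | null;
  loadedOverlayUrl: string | null;
  sourceWidth: number;
  sourceHeight: number;
}): boolean {
  return (
    input.currentOverlayUrl !== null &&
    // A source swap invalidates readiness until the new image reports onload.
    input.currentOverlayUrl === input.loadedOverlayUrl &&
    // Natural size must be known before any crop math is trustworthy.
    input.sourceWidth > 0 &&
    input.sourceHeight > 0
  );
}
```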
it("positions mixer overlay frame relative to the displayed base-image rect", () => {
expect(
computeMixerFrameRectInSurface({
surfaceWidth: 1,
surfaceHeight: 1,
baseWidth: 200,
baseHeight: 100,
overlayX: 0.1,
overlayY: 0.2,
overlayWidth: 0.4,
overlayHeight: 0.4,
}),
).toEqual({
x: 0.1,
y: 0.35,
width: 0.4,
height: 0.2,
});
});
it("returns null frame placement until base image natural size is known", () => {
expect(
computeMixerFrameRectInSurface({
surfaceWidth: 1,
surfaceHeight: 1,
baseWidth: 0,
baseHeight: 0,
overlayX: 0.1,
overlayY: 0.2,
overlayWidth: 0.4,
overlayHeight: 0.4,
}),
).toBeNull();
});
});
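The frame-placement tests above imply a two-step mapping: contain-fit the base image into the surface, then map the overlay frame (normalized to the displayed base rect) into surface coordinates, returning `null` while the base's natural size is still unknown. A hedged re-derivation (names local to this sketch):

```typescript
// Map a base-relative overlay frame into surface coordinates.
interface FrameRectInput {
  surfaceWidth: number;
  surfaceHeight: number;
  baseWidth: number;  // natural base-image size; 0 until the image has loaded
  baseHeight: number;
  overlayX: number;   // frame rect, normalized to the displayed base rect
  overlayY: number;
  overlayWidth: number;
  overlayHeight: number;
}

function frameRectInSurface(input: FrameRectInput) {
  if (input.baseWidth <= 0 || input.baseHeight <= 0) {
    return null; // natural size unknown: no stable placement yet
  }
  const surfaceAspect = input.surfaceWidth / input.surfaceHeight;
  const baseAspect = input.baseWidth / input.baseHeight;
  // Displayed base rect after contain-fit, normalized to the surface.
  let baseW = 1;
  let baseH = 1;
  if (baseAspect > surfaceAspect) {
    baseH = surfaceAspect / baseAspect;
  } else {
    baseW = baseAspect / surfaceAspect;
  }
  const baseX = (1 - baseW) / 2;
  const baseY = (1 - baseH) / 2;
  return {
    x: baseX + input.overlayX * baseW,
    y: baseY + input.overlayY * baseH,
    width: input.overlayWidth * baseW,
    height: input.overlayHeight * baseH,
  };
}
```

With a 2:1 base in a square surface the base occupies y 0.25..0.75, so `overlayY: 0.2` of the base height maps to surface y 0.35 and `overlayHeight: 0.4` to 0.2, matching the expectation above.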

View File

@@ -32,6 +32,13 @@ vi.mock("@/lib/image-pipeline/render-core", () => ({
vi.mock("@/lib/image-pipeline/source-loader", () => ({
loadSourceBitmap: sourceLoaderMocks.loadSourceBitmap,
loadRenderSourceBitmap: ({ sourceUrl }: { sourceUrl?: string }) => {
if (!sourceUrl) {
throw new Error("Render source is required.");
}
return sourceLoaderMocks.loadSourceBitmap(sourceUrl);
},
}));
describe("preview-renderer cancellation", () => {

View File

@@ -199,6 +199,48 @@ describe("worker-client fallbacks", () => {
expect(bridgeMocks.renderFull).not.toHaveBeenCalled();
});
it("does not include AbortSignal in full worker payload serialization", async () => {
const workerMessages: WorkerMessage[] = [];
FakeWorker.behavior = (worker, message) => {
workerMessages.push(message);
if (message.kind !== "full") {
return;
}
queueMicrotask(() => {
worker.onmessage?.({
data: {
kind: "full-result",
requestId: message.requestId,
payload: createFullResult(),
},
} as MessageEvent);
});
};
vi.stubGlobal("Worker", FakeWorker as unknown as typeof Worker);
const { renderFullWithWorkerFallback } = await import("@/lib/image-pipeline/worker-client");
await renderFullWithWorkerFallback({
sourceUrl: "https://cdn.example.com/source.png",
steps: [],
render: {
resolution: "original",
format: "png",
},
signal: new AbortController().signal,
});
const fullMessage = workerMessages.find((message) => message.kind === "full") as
| (WorkerMessage & {
payload?: Record<string, unknown>;
})
| undefined;
expect(fullMessage).toBeDefined();
expect(fullMessage?.payload).not.toHaveProperty("signal");
});
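The serialization test above exists because an `AbortSignal` is not structured-cloneable and would make `postMessage` throw; the signal has to stay on the main thread and only cloneable fields may cross into the worker. A minimal sketch of that split (the request shape is assumed, not taken from `worker-client`):

```typescript
// Strip the non-cloneable AbortSignal before sending a payload to a Worker.
interface FullRenderRequest {
  sourceUrl: string;
  steps: unknown[];
  render: { resolution: string; format: string };
  signal?: AbortSignal;
}

function toWorkerPayload(request: FullRenderRequest): Omit<FullRenderRequest, "signal"> {
  // Rest-destructuring drops the `signal` key entirely, so the remaining
  // object is safe for the structured clone that postMessage performs.
  const { signal, ...payload } = request;
  return payload;
}
```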
it("still falls back to the main thread when the Worker API is unavailable", async () => {
vi.stubGlobal("Worker", undefined);

View File

@@ -32,6 +32,9 @@ export default defineConfig({
"components/canvas/__tests__/use-node-local-data.test.tsx",
"components/canvas/__tests__/use-canvas-sync-engine.test.ts",
"components/canvas/__tests__/use-canvas-sync-engine-hook.test.tsx",
"components/canvas/__tests__/canvas-sidebar.test.tsx",
"components/canvas/__tests__/canvas-toolbar.test.tsx",
"components/canvas/__tests__/canvas-favorites-visibility.test.ts",
"components/canvas/__tests__/asset-browser-panel.test.tsx",
"components/canvas/__tests__/video-browser-panel.test.tsx",
"components/media/__tests__/media-preview-utils.test.ts",