Advertorial - DIGITAL PRODUCTION, Magazine for Digital Media Production
https://digitalproduction.com

Samsung Onyx: How modern display technology is redefining the cinema experience
https://digitalproduction.com/2025/12/18/how-modern-display-technology-is-redefining-the-cinema-experience-samsung-onyx/
Thu, 18 Dec 2025 11:02:47 +0000

Two people sitting in modern chairs, facing a large, vibrant digital display. The screen features an artistic representation of an eye surrounded by lush greenery and abstract elements, creating a captivating visual experience.

(Advertorial) Projection is out, pixels are in: Samsung’s Onyx Cinema LED aims to replace cinema projectors with HDR-capable LED walls and flexible sizing.

The post Samsung Onyx: How modern display technology is redefining the cinema experience first appeared on DIGITAL PRODUCTION and was written by Advertorial.


Audience expectations of the cinema experience have evolved significantly in recent years. Today, viewers expect image quality that aligns with contemporary production and post-production standards – with accurate color reproduction, high contrast and visual consistency that preserves creative intent. At the same time, cinema operators are challenged to future-proof their technical infrastructure while integrating formats such as HDR in a meaningful and sustainable way. This is where Samsung Onyx comes in.

One response to these requirements is LED-based cinema display technology such as the Samsung Onyx Cinema LED display (ICD series). Designed specifically for cinema environments, the system introduces new possibilities for image reproduction, presentation quality and operational consistency.

LED technology as an alternative to traditional projection

Unlike conventional projection systems, Samsung Onyx relies on self-emissive LED technology. The display supports resolutions up to 4K with a refresh rate of 120 Hz[1] and achieves peak brightness levels of up to 300 nits[2]. This enables bright image areas to remain clearly visible without color washout or loss of detail.

Deep blacks, an almost infinite contrast ratio, and high color accuracy enable nuanced image reproduction across the full brightness range. Especially in high-contrast scenes, this results in more precise detail reproduction – an aspect that is increasingly relevant for both filmmakers and audiences.

Two people seated in plush chairs, facing a large screen displaying a vibrant nighttime scene with a crescent moon and a stylized star explosion over a quiet street lined with illuminated buildings.

HDR in cinema: consistent implementation of modern workflows

As HDR content becomes more widespread, its reliable presentation in cinemas is gaining importance. Modern cinema displays must not only support extended brightness and color spaces technically but also reproduce them consistently in real-world screening environments. Systems such as Samsung Onyx are increasingly designed to support HDR workflows from mastering through to on-screen playback.

This approach helps ensure that creative decisions made during production and post-production remain visible in the cinema environment, strengthening the connection between artistic intent and the final audience experience.

Flexibility for different auditorium layouts

In addition to image quality, flexibility is a key consideration for cinema operators. Samsung Onyx is available in four standard screen sizes – 5, 10, 14 and 20 meters – and can be scaled beyond these formats to accommodate a wide range of auditorium dimensions.[3] This adaptability allows cinemas to optimize screen size without compromising visual performance.

The system is also compatible with established cinema audio solutions such as Dolby Atmos, Meyer Sound, QSC and JBL, enabling integration into existing sound infrastructures.

Digital experiences beyond the auditorium

Modern display technologies are not limited to the screening room itself. Digital signage solutions enable cinemas to enhance visual communication in lobby and service areas. Energy-efficient color e-paper displays[4] can be used to present program information or advertising content and are centrally managed via the Samsung VXT device and content management platform.

Additional Smart Signage displays can be deployed for menu boards, trailers or brand communication, creating a consistent visual experience throughout the entire cinema journey – from arrival to the start of the film.

Display technology is increasingly shaping the way audiences experience cinematic stories. As production standards continue to evolve, solutions like Samsung Onyx illustrate how image reproduction, brightness and color accuracy can be aligned more closely with creative intent. For cinemas, this opens new ways to present content in a technically consistent and visually compelling manner – supporting the medium of cinema as it adapts to changing audience expectations.[5]


[1] Based on the internal data bandwidth of the screen. Actual frame rates may vary depending on the connected IMB.

[2] Peak brightness is supported when using DCI-HDR-supported IMB.

[3] All measurements in meters refer to the screen width, while all measurements in inches refer to the diagonal. The 10-meter Onyx screen is available now, with the remaining models being introduced gradually.

[4] Thanks to advanced technology, color e-paper consumes significantly less energy than many other digital signage devices, especially when displaying static images. This can result in direct cost savings. The 4,600 mAh battery also offers high energy efficiency.

[5] The quality of film screenings may vary depending on the region and cinema.

Need an AI Video Enhancer in Your Workflow? Aiarty Video Enhancer!
https://digitalproduction.com/2025/10/09/aiarty-video-enhancer-ai-video-enhancer-in-your-workflow/
Thu, 09 Oct 2025 06:00:00 +0000

An illustration promoting Aiarty Video Enhancer AI, featuring a vivid hummingbird in mid-flight against a blurred background. The display shows video resolutions of 480P, 720P, and 4K Ultra HD, with a 'Turbo ON/OFF' toggle.

Blocky VHS? Grainy DSLR shots? Aiarty Video Enhancer upscales, denoises, and restores video locally—no cloud upload required. (Sponsored Post)

The post Need an AI Video Enhancer in Your Workflow? Aiarty Video Enhancer! first appeared on DIGITAL PRODUCTION and was written by Advertorial.


Many filmmakers, editors, and archivists have faced the same problems: low-resolution footage that looks blocky on modern displays, noisy shots captured in poor light, or old tapes and compressed files that feel unusable by today's production standards. Manually fixing these flaws takes hours, time that could be spent on editing and storytelling instead of patching technical problems. This is where Aiarty Video Enhancer fits in. Built for professionals who need reliable results, Aiarty uses carefully trained AI models to restore detail, upscale resolution, denoise footage, and smooth motion, all while keeping natural textures intact. Whether you're trying to unify mixed resolutions, revive digitized VHS or MiniDV archives, prepare client deliverables for 4K playback, or give YouTube content a professional polish, Aiarty helps you achieve more consistent quality with less effort.

What Is Aiarty Video Enhancer?

Aiarty Video Enhancer is a desktop-based AI video enhancer and upscaler tool, ...


This article is exclusively for Digital Production Subscribers.


Workstations for Virtual Production
https://digitalproduction.com/2025/08/08/workstations-for-virtual-production/
Fri, 08 Aug 2025 10:00:00 +0000

A virtual production setup featuring a model spacecraft on a desert-like terrain, with labeled components including cameras, workstations, and technology from Dell. Two operators are seated at computers with monitors.

Virtual production wouldn’t be possible without high-performance workstations. These systems deliver the computing power required for rendering, simulations, and complex visual effects directly within the studio environment.

The post Workstations for Virtual Production first appeared on DIGITAL PRODUCTION and was written by Advertorial.


by Peter Beck, Field Product Manager Workstations and Rugged at Dell Technologies in Germany. This new approach to filmmaking has revolutionized the industry by blending digital technologies with traditional techniques. It merges virtual environments with live-action filming, allowing studios to project ultra-high-resolution backgrounds onto LED walls where actors can perform in real time. Camera tracking lets virtual backgrounds react to camera movements in real time, making the scenery appear fully realistic. Studios can replicate real-world locations, such as buildings or landscapes, or create entirely fictional worlds, from alien planets to fantastical cities. The benefits of virtual production are extensive. Post-production work is often reduced, and overall costs are lower since crews no longer need to travel to global...


This article is exclusively for Digital Production Subscribers.


From Concept to Play: Streamlining Asset Pipelines with InstaMAT’s Scalable Workflows
https://digitalproduction.com/2025/07/23/from-concept-to-play-streamlining-asset-pipelines-with-instamats-scalable-workflows/
Wed, 23 Jul 2025 06:43:19 +0000

A gradient background transitioning from purple to blue, featuring a central white logo made of geometric shapes that resemble arrows, symbolizing movement or technology.

Procedural pipelines, scalable and automated asset texturing, physically-based terrain generation, and thousands of generative materials, if that’s your thing, you have to check out InstaMAT!

The post From Concept to Play: Streamlining Asset Pipelines with InstaMAT’s Scalable Workflows first appeared on DIGITAL PRODUCTION and was written by Advertorial.


InstaMAT is finally leaving early access with its 2025 release scheduled for 22nd July. This means it’s an excellent time to look at the breakthrough new features slated for this release, from AAA-quality terrain generation and artist-friendly curve painting tools to layer references and a massively improved real-time viewport with raytracing. Packed with hundreds of additions and improvements, InstaMAT 2025 continues to push the boundaries of what’s possible in generative 3D asset creation for all industries, from games to VFX.

Studios face mounting pressure to deliver expansive, visually stunning games on tighter schedules, all while supporting multiple platforms from high-end PCs to mobile devices. InstaMAT streamlines 3D asset texturing with scalable generative workflows for faster game development.

In the fast-paced world of game development, speed and scalability are paramount. A key bottleneck in this process is the asset pipeline—creating, texturing, and optimizing thousands of 3D assets like environments, props, and characters. Traditional workflows, reliant on manual texturing and iterative adjustments, are too slow to meet modern demands. InstaMAT, developed by Abstract, revolutionizes this process with its procedural, scalable workflows, enabling studios to streamline asset creation and move from concept to play faster than ever.

The Challenge of Modern Asset Pipelines

Game development has evolved dramatically. Open-world titles like Cyberpunk 2077 or Elden Ring feature thousands of unique assets, each requiring detailed textures and platform-specific optimizations. As production complexity grows, AAA production targets become increasingly difficult to meet while staying within budget. Meanwhile, indie studios and mid-sized teams must compete with AAA quality on leaner budgets. The asset pipeline, which encompasses modeling, texturing, baking, and optimization, often consumes weeks or months, with repetitive tasks draining resources. Scaling a single asset category, such as environmental props, into dozens of variations (e.g., different wear levels or materials) is labor-intensive. Additionally, ensuring assets perform across diverse platforms without sacrificing fidelity requires complex, manual adjustments.

These challenges highlight the need for scalable, automated workflows that maintain creative flexibility while reducing production time. InstaMAT addresses this need with its intuitive layering and node-based systems, designed to scale across asset categories and automate repetitive tasks, empowering teams to deliver high-quality games efficiently.

InstaMAT’s Scalable Workflows

InstaMAT is a breakthrough generative asset design and texturing tool that transforms production pipelines by prioritizing scalability and automation. Its layering and node-based workflows allow developers to create, iterate, and optimize assets at scale, seamlessly integrating with game engines like Unreal Engine and Unity. Unlike traditional tools that require manual edits for each asset variation, InstaMAT enables teams to build flexible, reusable systems that adapt to entire asset categories and automate complex processes.

Layering Projects for Category-Wide Scaling

InstaMAT’s layering system offers a non-destructive, artist-friendly approach to texturing, resembling familiar layer-based image manipulation and compositing tools but tailored for 3D assets. Designers can stack effects such as base materials, wear patterns, decals, or dirt in layers, adjusting parameters to create variations without starting from scratch. This system is ideal for scaling textures across an entire asset category, such as props, vehicles, or architecture.

For example, a single layering project for a set of sci-fi containers can define a base metal texture, procedural scratches, and dirt masks. By tweaking layer parameters (e.g., rust intensity or polish level), artists can generate dozens of unique boxes, machinery and other props from one template. These layers are resolution-independent, ensuring textures scale from high-res consoles to lightweight mobile platforms. The result is a unified, visually consistent asset category produced in a fraction of the time.
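The variation logic described above can be sketched in plain Python. This is purely illustrative: the parameter names (`rust_intensity`, `polish_level`) and their value ranges are assumptions made for the sake of the example, not InstaMAT’s actual scripting interface.

```python
from itertools import product

# Hypothetical layer parameters; names and ranges are assumptions,
# not InstaMAT's real interface.
RUST_LEVELS = [0.0, 0.3, 0.6, 0.9]
POLISH_LEVELS = [0.2, 0.5, 0.8]

def variation_jobs(asset_name):
    """Expand one layering template into a list of export jobs,
    one per parameter combination."""
    return [
        {
            "output": f"{asset_name}_{i:02d}",
            "params": {"rust_intensity": rust, "polish_level": polish},
        }
        for i, (rust, polish) in enumerate(product(RUST_LEVELS, POLISH_LEVELS))
    ]

jobs = variation_jobs("sci_fi_container")
print(len(jobs))  # 12 texture variants from a single template
```

The point of the sketch is the shape of the workflow: one non-destructive template plus a small parameter sweep yields a whole family of consistent assets.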

Node-Based Projects for Workflow Automation

For technical artists and larger teams, InstaMAT’s node-based system provides powerful automation capabilities. This workflow lets users build custom pipelines by connecting nodes that define material properties, texture maps, or optimization steps. Once created, these graphs can automate repetitive tasks across hundreds or thousands of assets, streamlining production.

For instance, a node graph for environmental rocks can automate the creation of normal maps, ambient occlusion, and diffuse textures, with parameters for size, erosion, and moss coverage. Applying this graph to a library of rock models generates unique variations instantly, each optimized for specific platforms. Nodes can also tap into powerful geometry processing technology from InstaLOD (Abstract’s end-to-end asset optimization and creation software) to automate LOD (level of detail) generation, ensuring assets are game-ready without manual intervention.
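Conceptually, such a node graph is an ordered chain of transformations applied to an asset description. The sketch below models that idea in plain Python; the node names and parameters (`size`, `erosion`, `moss`) follow the rock example above but are illustrative, not InstaMAT’s real node set.

```python
# Each "node" takes an asset dict and returns an enriched copy;
# the graph is simply an ordered chain of nodes. Names are illustrative.
def apply_params(asset, size=1.0, erosion=0.5, moss=0.0):
    return {**asset, "params": {"size": size, "erosion": erosion, "moss": moss}}

def bake_maps(asset):
    return {**asset, "maps": ["normal", "ambient_occlusion", "diffuse"]}

def generate_lods(asset, ratios=(1.0, 0.5, 0.25)):
    return {**asset, "lods": list(ratios)}

def run_graph(asset, graph):
    for node in graph:
        asset = node(asset)
    return asset

rock_graph = [
    lambda a: apply_params(a, size=2.0, erosion=0.7, moss=0.3),
    bake_maps,
    generate_lods,
]

# Applying the same graph to a whole model library yields
# game-ready variations without per-asset manual work.
library = [run_graph({"name": f"rock_{n:02d}"}, rock_graph) for n in range(10)]
print(len(library))  # 10 rocks processed by one reusable graph
```

Once the graph exists, scaling from ten rocks to a thousand is a change to the input list, not to the pipeline.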

In the following video, you can see how the layering project for sci-fi containers was used in a node-based workflow, simply by dragging the project into the workspace. This enabled the artist to build on the project and add more procedural logic and functionality, such as automatic retopology, UV unwrapping, and baking. The result is an end-to-end asset pipeline that enables dramatic cost savings wherever it is applied.

Integration and Optimization

InstaMAT’s workflows integrate with Abstract’s ecosystem, including InstaLOD for automatic retopology and asset optimization, Polyverse for cloud-based asset management, and RSX Engine for real-time collaboration. Textures created in InstaMAT are exported in standard PBR formats (e.g., base color, metalness, roughness), ensuring compatibility with major engines. Projects and materials can be instantly uploaded to Polyverse for cloud-based execution, shared collaboration, and organization between project members, while assets can be optimized and processed in the cloud, making it a versatile solution for large and small teams alike.

Use Case: Scaling a Sci-Fi Prop Library with InstaMAT

Imagine a mid-sized studio developing a sci-fi shooter with a sprawling space station environment. The art team needs to create 200 unique props—crates, panels, and terminals—with varied textures (e.g., clean, rusted, battle-damaged) optimized for PC, console, and mobile platforms. Traditionally, this would require weeks of manual texturing and optimization, with artists painstakingly crafting each variation. With InstaMAT, the studio streamlines the process:

Step 1: Layering Project Setup

Artists create a layering project for crates, defining a base metallic texture with layers for paint, scratches, and decals. Parameters like rust intensity and decal placement are adjustable, allowing one project to generate 50 unique crate textures.

Step 2: Scaling Across Categories

The team adapts the layering project for panels and terminals, reusing shared layers (e.g., rust, dirt) while adding category-specific effects like holographic displays. This scales the workflow to cover all 200 props with consistent quality.

Step 3: Pipeline Automation

Leveraging InstaLOD nodes in InstaMAT for automatic retopology, baking, and LOD generation, technical artists build a node graph to automate asset processing: low-poly generation, texture map generation (normal, roughness, metalness), texture application, and LOD creation. InstaMAT Pipeline then processes all 200 props in a single batch, creating both texture LODs (4K/2K/1K for PC, 2K/1K/512 for mobile) and geometry LODs (100%/50%/25% polygon reduction) with platform-specific settings.
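As a rough illustration of the batch setup, the tiers described above can be expressed as a per-platform job list. The tier values come from the text; the job structure itself is an assumption, not InstaMAT Pipeline’s actual configuration format.

```python
# Tier values from the article; the job structure is an assumption,
# not InstaMAT Pipeline's real configuration format.
TEXTURE_TIERS = {
    "pc": [4096, 2048, 1024],     # 4K / 2K / 1K
    "mobile": [2048, 1024, 512],  # 2K / 1K / 512
}
GEOMETRY_TIERS = [1.00, 0.50, 0.25]  # 100% / 50% / 25% polygon reduction

def batch_jobs(props, platform):
    """One job per prop, listing the texture and geometry LODs the
    pipeline should emit for the target platform."""
    return [
        {
            "prop": prop,
            "platform": platform,
            "texture_lods": TEXTURE_TIERS[platform],
            "geometry_lods": GEOMETRY_TIERS,
        }
        for prop in props
    ]

props = [f"prop_{n:03d}" for n in range(200)]
print(len(batch_jobs(props, "mobile")))  # 200 jobs in a single batch
```

Encoding the platform targets as data rather than per-asset settings is what makes a 200-prop batch a one-step operation.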

Step 4: Project Synchronization

InstaMAT integrates with Polyverse to provide all project members with up-to-date assets and workflows. Using Polyverse Sync, all assets are synchronized across all artists’ and developers’ workstations, ensuring consistent versioning and material updates across the entire development pipeline.

Step 5: Collaborative Iteration

Using RSX Engine’s real-time collaboration functionality, the team reviews the props under different lighting conditions and in different scenes while tweaking material shaders collaboratively, finalizing the prop library in days.

The outcome is a visually diverse, optimized prop library completed in a fraction of the usual time. The studio delivers a polished game on schedule, with assets that perform seamlessly across platforms, enhancing player immersion.

In the following video, you can see how it all comes together. With RSX Engine, running in a browser, the props are directly downloaded from Polyverse and a scene is assembled collaboratively with other users. The props can be reviewed with full interaction using component logic of the game project. Next, you can review the prop library, using the Polyverse web application.

How Game Developers Benefit from InstaMAT

InstaMAT’s scalable workflows address critical industry needs:

  • Efficiency: Layering projects reduce manual work, enabling rapid iteration and scaling across asset categories.
  • Automation: Node-based pipelines eliminate repetitive tasks, freeing artists to focus on creativity.
  • Cross-Platform Support: InstaLOD technology integration ensures assets are optimized for any platform, from VR to mobile.
  • Accessibility: Free Pioneer Licenses make InstaMAT accessible to indie and student developers alike, leveling the playing field with larger AAA studios.

As games grow in scope and complexity, InstaMAT empowers developers to build richer worlds faster, without compromising quality. Its procedural approach not only accelerates production but also fosters creativity, allowing teams to experiment with bold designs and deliver unforgettable player experiences.

Get Started with InstaMAT

Ready to streamline your asset pipeline? Sign up for a free Pioneer License or request an evaluation to explore InstaMAT’s full feature set. Join the Abstract Community forum to share workflows, learn from peers, and stay updated on the latest tools. From concept to play, InstaMAT is your partner in crafting the next generation of games.

About Abstract

Abstract is a deep-tech company pioneering 3D and AI technology. Its products empower game developers, VFX and film, enterprise, XR, and metaverse industries to deliver efficiently with massive cost savings. InstaLOD converts CAD to 3D, optimizes geometry and automates 3D pipelines, InstaMAT introduces generative materials and scalable texturing, Polyverse enhances cloud-based asset management and 3D data processing as a service, while RSX Engine enables real-time collaboration and cloud synchronization when building 3D applications and games. Abstract is driving breakthrough innovation in 3D and AI across industries.

Express Delivery: Adobe, ComLine & the Speed of Content
https://digitalproduction.com/2025/04/22/express-delivery-adobe-comline-the-speed-of-content/
Tue, 22 Apr 2025 13:55:49 +0000

A woman smiling while participating in a webinar about Adobe Express. The image is bordered with colorful elements and includes text promoting the event on April 29, 2025, in German.

Adobe and ComLine present Adobe Express as a fast-track content tool with Firefly AI and scheduling, live-demoed by Adobe’s Sven Brencher on April 29.

The post Express Delivery: Adobe, ComLine & the Speed of Content first appeared on DIGITAL PRODUCTION and was written by Advertorial.


On April 29, 2025, at 10 AM, distributor ComLine hosts a live session on Adobe Express—part product demo, part positioning statement, part classroom for creatives. Presented by Adobe Senior Specialist Sven Brencher and moderated by Andreas Christensen, Business Development Manager Software, the session will give attendees a hands-on look at Adobe’s fast-track content creation tool, now tightly integrated with Adobe Firefly and a native social media planning tool.

Rounding out the event team is Johannes Borm, Director of Marketing at ComLine, who will act as host and stage-setter for what promises to be both a sales argument and a practical how-to.

Adobe Express: Streamlined Creativity

According to Sven Brencher, Adobe Express is not a stripped-down Photoshop or Premiere. It’s a purpose-built tool aimed at users who want to create content quickly and with minimal friction. Whether that’s for photo, video, web, or print, the software promises ready-to-go templates, brand kits, and drag-and-drop logic, without requiring full-scale Creative Cloud expertise.

Firefly Inside: AI That Thinks in Templates

A highlight of the session is Firefly’s integration into Adobe Express. Firefly is Adobe’s generative AI engine, which now sits at the heart of Express—offering real-time suggestions and layout proposals as users build out content. Brencher will demo how Firefly can propose visuals for social media posts, streamline graphic production, and aid in variation generation—all within the same UI.

The Social Media Planning Tool: No More Spreadsheet Sprints

Integrated directly into Express, Adobe’s social media planning tool lets users create, schedule, and publish posts from a single dashboard. Brencher positions this feature as a native alternative to external schedulers—giving teams the ability to plan entire content calendars without exiting the app.

Additionally, Borm notes that Express forms a compelling component in volume licensing scenarios—particularly for universities and schools, where simplicity, speed, and price-per-seat matter more than having every CC knob and slider.

Education, Simplified

One part of the session focuses on Adobe Express in the educational context. With institutions looking for tools that reduce onboarding time and keep learning curves low, Adobe and ComLine jointly frame Express as a lightweight but capable solution. The emphasis: fewer buttons, faster output, and a good-enough-for-most-results logic.


More information about upcoming ComLine events is available here.
