A data-oriented shell

Parallelism for Heavy Workloads

Nushell has experimental support for parallel execution in pipelines. Specifically, it provides parallel versions of some commands, such as par-each, which runs its body for each element of a list concurrently across multiple threads [3]. If each item in a pipeline can be processed independently (processing a list of files, pinging multiple servers, and so on), consider using par-each instead of each. Used appropriately, parallelism can significantly improve performance on multicore systems for large workloads.
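As a sketch of the difference (the *.log file names and the sha256 hashing step are just placeholders for any independent per-item work):

```nu
# Sequential: hash each file one after another
ls *.log | each {|f| {name: $f.name, hash: (open --raw $f.name | hash sha256)} }

# Parallel: the same body, run concurrently across threads
ls *.log | par-each {|f| {name: $f.name, hash: (open --raw $f.name | hash sha256)} }
```

Note that par-each makes no guarantee about the order of its output, so add a sort afterwards if ordering matters.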

Memory Considerations

Because Nushell holds structured data in memory, be mindful of extremely large inputs. If you open a giant.json that is, say, 500MB, Nushell must load and represent that structure in memory, and the in-memory representation can be several times larger than the file on disk. When memory is a concern, consider processing data in chunks (if possible) or combining Nushell with streaming tools. For instance, you could pipe data through jq to pre-filter a large JSON file before Nushell ingests it, or use tail -f style streaming for logs and have Nushell process them incrementally. Always test with smaller samples and monitor resource usage.
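The jq pre-filtering idea might look like the following sketch (the .records[] path and the level field are made-up names here; adapt them to your data):

```nu
# jq stream-filters the big file and emits one compact JSON object per line,
# so Nushell only has to parse the records that survive the filter
jq -c '.records[] | select(.level == "error")' giant.json | lines | each {|line| $line | from json }
```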

Concurrency and Background Tasks

Nushell can also run commands in the background (there's an & operator for background tasks and a way to check on these tasks [4]). Offloading long-running tasks to the background keeps your shell free for other work. However, as of this writing, background task management in Nushell is basic, so heavy parallel background jobs might be better handled by external orchestrators or by using par-each.
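Recent Nushell releases expose background work through a small set of job commands; the following sketch assumes one of those versions (check help job on your install, because the exact commands have changed between releases):

```nu
# Start a long-running task without blocking the prompt
job spawn { sleep 30sec }

# See which background jobs are still running
job list
```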

In general, Nushell's performance for everyday tasks (listing directories, parsing moderate JSON, etc.) is very good. When pushing the boundaries (very large data or very many operations), remember it's essentially a small data engine: Use the tools it provides (such as parallel commands or the dataframe plugin) to help Nushell out. And don't hesitate to combine Nushell with other optimized tools for specific steps. For example, use rg (ripgrep) for very fast text searching if a plain-text search is what you need, and then feed the results into Nushell. The goal is to use Nushell where it adds value, not to force it into scenarios it isn't optimized for.
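As one illustration of that division of labor (the search pattern and path are placeholders), ripgrep can do the raw text scan while Nushell structures and aggregates the hits:

```nu
# rg finds matches fast; parse turns each "file:line:text" hit into a record;
# group-by then summarizes the matches per file
rg --line-number "TODO" src | lines | parse "{file}:{line}:{text}" | group-by file
```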
