Risky Module Design Approaches

There are many ways to build a JavaScript module. Conventional patterns such as the singleton and custom type are widely embraced and offer reliable functionality. Other design patterns push the boundaries of module construction and invite scrutiny about whether they are ever appropriate. While the former are commonly encouraged, the latter are usually met with skepticism. This article explores that latter category.

Before going further, it must be stated explicitly that nearly everything described here should be avoided in production settings. These patterns can sow chaos for you and your team down the road, surfacing as hidden bugs and unforeseen side effects. Yet they exist for a reason, and when applied judiciously (read: with utmost caution), they can solve real problems that safer, more conventional patterns cannot. Just handle them with care: their side effects are genuinely dangerous.

Dynamic Modifications

JavaScript, with its dynamic nature and prototype-based architecture, empowers developers to manipulate objects and classes across entire applications. Thus, when you find yourself wishing that JavaScript strings could autonomously convert to Pig Latin, you can implement something akin to:

String.prototype.pigLatin = function() { /* … */ }
'Is this actually a good idea?'.pigLatin() // 'Is-ay is-thay actually-ay an ood-gay idea-ay?'

While appending new methods is relatively straightforward, altering existing ones takes more finesse. You can simply override them, but if the original behavior is still needed, you must save a reference to it first. For instance, to add data to every template rendered in an Express application:

// Preserve the original render function for subsequent use
res._render = res.render;

// Enhance the render function to preprocess arguments prior to rendering
res.render = function(view, options, callback) {
  options.global = { /* … */ };
  this._render(view, options, callback);
};

This practice, known as monkey patching, is generally frowned upon due to its propensity to contaminate the application’s shared environment. It risks clashes with other patches and can be a nightmare to debug even when functioning correctly. Although a potent workaround, its application is thankfully limited.

Still, desperate times occasionally call for a monkey patch. In such cases, encapsulating the patch within its own module quarantines the hack and isolates it from the rest of the application. Centralizing monkey patches also makes them easier to debug when something goes wrong, and it underscores the need for rigorous environmental assertions: verifying that no conflicts exist and that the environment matches expectations before applying the patch can prevent arduous debugging sessions down the line.

Additionally, consider exporting the monkey patch as a singleton, featuring a solitary apply() method for executing the code. Explicitly applying the patch (rather than implicitly through module loading) elucidates the module’s purpose and enables parameter passing, potentially pivotal depending on the use case:

// some-monkey-patch/index.js
module.exports = {
  apply: function() {
    /* Validate environment/arguments & apply patch */
  }
};

// Later…
require('some-monkey-patch').apply();

Polyfills

Polyfills predominantly surface in client-side development, addressing disparate feature support across browsers. Rather than accommodating the lowest common denominator (a nod to IE), polyfills empower developers to augment old browsers with modern features, fostering standardization across platforms.

Server-side developers might assume immunity to this dilemma. However, Node.js's protracted v0.12 development cycle meant that even Node.js developers could wait a long time for a feature to reach a stable release. For instance, although async listeners debuted in v0.11.9, stable adoption awaited v0.12.0.

Alternatively, one could opt for an async listener polyfill:

// Load polyfill if native support is absent
if (!process.addAsyncListener) require('async-listener');

Although fundamentally a form of monkey patching, polyfills are typically safer to apply. Limited to implementing predefined features, polyfills are bolstered by existing specifications, albeit necessitating meticulous scrutiny akin to traditional monkey patches. Understanding the code being integrated, anticipating potential collisions (since specs can evolve), and affirming the environment’s conformity prior to patch application are pivotal.

JSON Modules

JSON emerges as the de facto data format in Node.js, with native support streamlining interaction with static data files as though they were JavaScript modules. Notably, the original http-status-codes-json module comprised a static JSON file, effectively morphing into an interactive HTTP status code compendium, courtesy of Node’s JSON support:

// http_status_codes.json
{
  "100": "Continue",
  "200": "OK",
  /* … */
}

This functionality is powerful, but exercise caution before refactoring your code around it. Module loading is synchronous, suspending all other operations until the read completes, and the parsed result persists in the module cache indefinitely. Unless you specifically need module semantics, prefer fs.readFile() and JSON.parse() to avoid the performance cost and added complexity.
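If direct require() semantics aren't needed, a non-blocking read is straightforward. A minimal sketch, assuming the same http_status_codes.json file sits alongside the script:

var fs = require('fs');

// Read and parse the file asynchronously, avoiding both the blocking
// read and the permanent entry in the module cache
fs.readFile('./http_status_codes.json', 'utf8', function(err, data) {
  if (err) throw err;
  var statusCodes = JSON.parse(data);
  console.log(statusCodes['200']); // 'OK'
});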

Compile-to-JS Modules

While Node readily accommodates JSON, attempting to require() any other file type triggers an error. However, diligent exploration reveals Node’s flexibility in supporting diverse file types, contingent upon developer-provided parsers.

The mechanism operates as follows: Node maintains a set of internal extension handlers, one per supported file type, each tasked with loading, parsing, and exporting a valid representation of that file type. For instance, the native JSON handler reads files via fs.readFileSync(), parses them with JSON.parse(), and binds the resulting object to module.exports. While these parsers live inside Node's module system, they are externally accessible through require.extensions.
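To make this concrete, here is roughly what the built-in JSON handler boils down to (a simplified sketch, not Node's actual source):

var fs = require('fs');

// Simplified version of Node's native '.json' extension handler
require.extensions['.json'] = function(module, filename) {
  var content = fs.readFileSync(filename, 'utf8');
  module.exports = JSON.parse(content);
};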

CoffeeScript stands out as a prominent compile-to-JS language, albeit necessitating manual compilation post-editing. Leveraging the aforementioned technique, enthusiasts could seamlessly integrate CoffeeScript support into Node.js, automating the compilation process:

var fs = require('fs');
var coffeescript = require('coffee-script');

module.exports = {
  apply: function() {
    // Integrate CoffeeScript extension into Node.js
    // (note the leading dot: extension keys include it)
    require.extensions['.coffee'] = function coffeescriptLoader(module, filename) {
      // Read contents from the '.coffee' file
      var fileContent = fs.readFileSync(filename, 'utf8');
      // Compile into JavaScript for V8 interpretation
      var jsContent = coffeescript.compile(fileContent);
      // Pass compiled contents as a standard JavaScript module
      module._compile(jsContent, filename);
    };
  }
};

// Later…
require('require-coffee').apply();

Note: This feature was deprecated upon realizing that preprocessing code into JS or JSON before runtime typically yields superior outcomes. Parsing directly during runtime can obscure bugs, as the actual JS/JSON generated remains invisible.

MP3 Modules?

While CoffeeScript seamlessly integrates with JavaScript, Node.js delegates file representation to developers, permitting require() for any file type. Illustratively, let’s explore this with an entirely distinct file type like MP3.

Simply reading the file and returning its contents as an MP3 module would be too simplistic. Instead, we can add depth by extracting song metadata with the audio-metadata module:

var audioMetaData = require('audio-metadata');

// Custom type representing the MP3 file and its metadata
function MP3(file) {
  // Attach file contents
  this.content = file;
  // Process and attach audio ID3 tags
  this.metadata = audioMetaData.id3v2(file);
}
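The original post breaks off here, but by analogy with the CoffeeScript loader above, the registration step would plausibly look something like this (a sketch, not the author's original code):

var fs = require('fs');

// Integrate the MP3 extension: teach require() to load '.mp3' files
require.extensions['.mp3'] = function mp3Loader(module, filename) {
  var fileContent = fs.readFileSync(filename);
  module.exports = new MP3(fileContent);
};

// Later…
var song = require('./some-song.mp3'); // hypothetical file
console.log(song.metadata);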

How require() Functions in Reality

Most Node.js developers can explain the purpose of the require() function, but few truly understand its inner workings. Despite its frequent use for loading libraries and modules, its underlying behavior often remains a mystery.

Intrigued by this, I delved into the Node core to uncover the mechanics behind require(). Instead of encountering a simple function, I stumbled upon the core of Node’s module system: module.js. This file houses a powerful yet largely unnoticed core module that governs the loading, compiling, and caching of every utilized file. Surprisingly, require() merely scratches the surface of this intricate system.

module.js:

function Module(id, parent) {
  this.id = id;
  this.exports = {};
  this.parent = parent;
  // …
}

The Module type within module.js serves two primary purposes in Node.js. Firstly, it provides a foundation for all Node.js modules, ensuring consistency and enabling the attachment of properties to module.exports. Secondly, it manages Node’s module loading mechanism.

The standalone require() function, commonly used, is actually an abstraction over module.require(), which, in turn, wraps around Module._load(). This load() function handles the actual loading of each file, marking the starting point of our exploration.

Module._load:

Module._load = function(request, parent, isMain) {
  // 1. Check Module._cache for the cached module.
  // 2. Create a new Module instance if cache is empty.
  // 3. Save it to the cache.
  // 4. Call module.load() with the given filename.
  //    This invokes module.compile() after reading the file contents.
  // 5. If there's an error loading/parsing the file,
  //    remove the faulty module from the cache.
  // 6. Return module.exports.
};

Module._load is responsible for loading new modules and managing the module cache, optimizing performance by reducing redundant file reads and facilitating the sharing of module instances.

If a module isn’t cached, Module._load creates a new base module for that file, instructing it to read the file’s contents before passing them to module._compile().

The magic intensifies with module._compile(), where a special standalone require() function is generated for the module, encapsulating the loaded source code within a new function. This function creates a distinct functional scope for the module, preventing environmental pollution.
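Conceptually, the wrapping looks something like this, a simplified sketch of what module._compile() does with each file's source:

// Each module's source is wrapped in a function before evaluation,
// giving the file its own scope and its own require(), module, etc.
// (sourceCode stands in for the file contents read earlier)
var wrapped = '(function (exports, require, module, __filename, __dirname) {\n'
  + sourceCode
  + '\n});';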

In conclusion, our journey through the require() code path has unveiled the intricate mechanisms that power it. Additionally, the revelation that require('module') is possible epitomizes the versatility of Node’s module system.
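As a quick demonstration, requiring the module system lets you inspect the very cache described above:

// The module system is itself a module
var Module = require('module');

// List every file currently loaded and cached
console.log(Object.keys(Module._cache));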

For those eager to delve deeper, exploring the source code of module.js offers further enlightenment. And for the intrepid few who unravel the mystery of ‘NODE_MODULE_CONTEXTS,’ accolades await.

Creating Factory Layouts

In a previous discussion, we introduced the Custom Type Module Pattern and its utility in crafting bespoke objects within your application. While these custom types hold significance in JavaScript development, directly crafting them can swiftly lead to clutter. In this piece, I’ll introduce a novel approach, the Factory pattern, which significantly streamlines the process of working with custom types.

So, what exactly is a Factory?

To grasp the essence of a factory, it’s crucial to understand the problem it aims to address. Directly creating a new custom object necessitates invoking the constructor with the ‘new’ keyword and any requisite arguments. This grants absolute control over creation, a powerful capability indeed, yet not always pragmatic.

Consider the following illustration using custom types alone:

var Widget = require('./lib/widget');
// Somewhere in your application…
var redWidget = new Widget(42, true);
redWidget.paintPartA('red');
redWidget.paintPartB('red');
redWidget.paintPartC('red');
// Elsewhere in your application…
var blueWidget = new Widget(42, true);
blueWidget.paintPartA('blue');
blueWidget.paintPartB('blue');
blueWidget.paintPartC('blue');
// And yet again somewhere else…
var greenWidget = new Widget(42, true);
greenWidget.paintPartA('green');
greenWidget.paintPartB('green');
greenWidget.paintPartC('green');

A complex constructor can be excessive in simpler cases, and repeatedly passing the same arguments to it quickly becomes the norm. Both scenarios are, at best, inconvenient and, at worst, entirely unscalable. Copypasta in code is never a pleasant sight.

Enter the factory to the rescue. A factory’s role is to create objects so that you don’t have to, offering several advantages. Whether you’re developing a public module for npm or internally within your team, integrating a factory interface can enhance the usability of your module. This is particularly true if additional steps are required post-constructor invocation. A factory consolidates and standardizes the custom creation process for both developers and users, outlining precisely how new objects should be crafted.

With the aid of a factory, the above example transforms into:

var widgetFactory = require('./lib/widget-factory');
var redWidget = widgetFactory.getRedWidget();
var blueWidget = widgetFactory.getBlueWidget();
var greenWidget = widgetFactory.getGreenWidget();

If you encounter struggles with intricate object creation spread across various parts of your application, this pattern could be the solution. By centralizing this logic in a singular location, you prevent its proliferation throughout your codebase where it doesn’t belong.

Separating the Product

Before constructing a factory, ensure that any custom types you intend to utilize are already defined as separate modules. While it might be tempting to house both the factory and the product (the object returned by a factory) within the same file, adhering to the principle of small, single-purpose modules is crucial. Segregating these modules diminishes confusion regarding their distinct responsibilities. In practice, this separation enhances testing coverage and code reusability.

The Factory Design Pattern

The Factory pattern can be built upon either singleton or custom type patterns. If your entire application can share a single factory object, constructing it as a singleton is often the ideal choice. A singleton can be effortlessly required anywhere and necessitates no intricate construction or distribution. However, it’s imperative to remember that your state is confined to the single module.

var Widget = require('./widget');

module.exports = {
  getRedWidget: function getRedWidget() {
    var widget = new Widget(42, true);
    widget.paintPartA('red');
    widget.paintPartB('red');
    widget.paintPartC('red');
    return widget;
  },
  getBlueWidget: function getBlueWidget() {
    // …
  }
};

Nonetheless, sharing the same factory across your application may not always suffice. Occasionally a factory demands its own setup and customization. In such scenarios, building your factory with the custom type pattern is more appropriate. This approach lets each factory instance offer a basic level of flexibility while still encapsulating product customization.

var Widget = require('./lib/widget');

var WidgetFactory = module.exports = function WidgetFactory(options) {
  this.cogs = options.cogs;
  this.bool = options.bool;
};

WidgetFactory.prototype.getRedWidget = function getRedWidget() {
  var widget = new Widget(this.cogs, this.bool);
  widget.paintPartA('red');
  widget.paintPartB('red');
  widget.paintPartC('red');
  return widget;
};

WidgetFactory.prototype.getBlueWidget = function getBlueWidget() {
  // …
};
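Usage might then look like this (a sketch, assuming the factory module lives at ./lib/widget-factory):

var WidgetFactory = require('./lib/widget-factory');

// Each factory instance carries its own configuration
var widgetFactory = new WidgetFactory({ cogs: 42, bool: true });
var redWidget = widgetFactory.getRedWidget();
var blueWidget = widgetFactory.getBlueWidget();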

Abstract Factories

Once familiar with factories, you can explore innovative ways to leverage them. The Abstract Factory pattern, for instance, elevates flexibility by taking the concept a step further. If you possess multiple factories, each crafting different products from a common parent type, establishing a shared interface among them is beneficial. This facilitates seamless interchangeability between factories without necessitating modifications across your codebase.

function someMiddleware(req, res, next) {
  if (/* user is logged in */) {
    req.userFactory = new LoggedInUserFactory(req.session.userID);
  } else {
    req.userFactory = new LoggedOutUserFactory();
  }
  next(); // This will call nextMiddleware() with the same req/res objects
}

function nextMiddleware(req, res, next) {
  var user = req.userFactory.getUser();
  var credentials = req.userFactory.getCredentials();
  res.send('Welcome back, ' + user.fullName);
}

Here, two distinct factories exist for logged-in and logged-out users. If they exposed different interfaces (say, createUser() vs. createAnonymousUser()), every consumer would have to check which kind of factory it was holding before calling anything, scattering redundant checks across the application when ideally only one is needed.

By giving every factory the same interface, you implement an abstract factory. Each factory can then be used without concern for which type of user is being created: the check happens once, in someMiddleware(), and nowhere else.
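A minimal sketch of such a shared interface, with hypothetical helpers throughout:

// Both factories expose the same getUser()/getCredentials() interface,
// so downstream code never needs to know which one it received
LoggedInUserFactory.prototype.getUser = function() {
  return loadUserFromSession(this.userID); // hypothetical helper
};

LoggedOutUserFactory.prototype.getUser = function() {
  return new AnonymousUser(); // hypothetical type
};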

Conclusion

A robust custom type interface should possess the flexibility to accommodate all reasonable use cases. However, this flexibility may come at a price. As the array of supported options expands, so does the interface and requisite setup, potentially introducing complexity. Moreover, creation intricacies may permeate your application, leading to duplicated code blocks for each new object. Factories offer a clean solution, restoring order to your application by simplifying and streamlining complex and repetitive customizations, making them more maintainable in the process.

What does The Node Way entail?

Node.js has always been relatively easy to understand but mastering it can be quite challenging. While there are design patterns and best practices to avoid common mistakes and build more robust applications, these principles have not always been well-documented. Instead, developers have often had to uncover them through trial and error.

In response to this frustration, I embarked on a journey to uncover the philosophy behind Node.js and what people truly mean when they refer to “The Node Way.” Is it merely about utilizing small modules and asynchronous code, or is there something deeper?

My exploration led to the creation of thenodeway.io, a platform containing articles and chapters aimed at understanding the core principles of Node.js and using that understanding to create exceptional projects.

But before delving into the details, let’s address the fundamental question: What exactly is The Node Way?

Structure

Node.js modules serve as the fundamental building blocks of applications, akin to atoms or DNA. They promote modularity while offering flexibility to fit into various contexts. While the choice of how to utilize these modules is yours, the following guidelines can steer you in the right direction:

  1. Build small modules: The Node.js philosophy emphasizes constructing modules with a single purpose, following the Unix tradition of composing complex systems from smaller components.
  2. …But exercise restraint: While modularity is powerful, it’s essential not to overdo it. Separating every function into its own module might not always simplify your codebase; simplicity should be the ultimate goal.
  3. Embrace complexity through composition: Instead of directly inheriting functionality from other modules, consider using objects internally, focusing on interfaces rather than implementation details (see the sketch after this list).
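A minimal sketch of composition over inheritance, with hypothetical module names:

var Logger = require('./logger'); // hypothetical module

// Composition: a Server *has a* logger rather than *is a* logger,
// exposing only the interface it chooses to support
function Server(options) {
  this.logger = new Logger(options.logPath); // internal detail, easy to swap
}

Server.prototype.start = function() {
  this.logger.info('server starting');
  // …
};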

Async

Node.js liberates JavaScript from the confines of the browser, enabling the creation of applications, servers, and even operating systems. Contrary to popular belief, JavaScript's expansion beyond the browser was not its original intent. Node.js's creator, Ryan Dahl, aimed to address frustrations with existing web frameworks by envisioning a server that handles many requests concurrently through non-blocking, event-driven I/O, with V8 supplying a fast JavaScript runtime.

  1. Harness I/O, avoid heavy computation: Node.js leverages asynchronous I/O, enabling it to efficiently manage multiple requests simultaneously. Offloading intensive tasks to background workers or APIs ensures optimal performance.
  2. Prioritize clean asynchronous code: Given the pivotal role of asynchronous operations, investing in readable asynchronous code is crucial to maintaining a healthy codebase and avoiding the pitfalls of nested callbacks.
  3. Implement standard callback patterns: Adhering to error-first callback patterns facilitates interoperability between modules and enhances code readability.
  4. Utilize control flow libraries: Libraries like async simplify asynchronous programming by offering utilities for managing callback behavior, reducing the complexity of asynchronous code.

Community

The Node.js community, fostered by tools like NPM, plays a pivotal role in the ecosystem's growth and evolution.

  1. Leverage the ecosystem: With over 200,000 modules available on NPM, there’s likely a package for almost any functionality you require. Before reinventing the wheel, explore existing modules.
  2. Choose the right tool for each task: NPM’s modular nature allows you to select the most suitable packages for your project, promoting flexibility and efficiency.
  3. Write modules with readability in mind: When contributing to the community, prioritize clarity and documentation to facilitate ease of use for others.
  4. Embrace and contribute to the community: The Node.js community thrives on collaboration and mutual support. By engaging with the community, you contribute to its growth and success.

In summary, The Node Way encompasses principles of modularity, asynchronous programming, and community engagement, all aimed at fostering the creation of robust and scalable Node.js applications.

Correspondence with Understanding Error Callbacks

If the V8 Engine from Google is the core engine of your Node.js application, then callbacks act as its circulatory system. They facilitate a smooth, non-blocking flow of asynchronous control within modules and applications. However, for callbacks to function effectively at scale, a standardized and reliable protocol is essential. The “error-first” callback, also referred to as an “errorback”, “errback”, or “node-style callback”, was introduced to address this requirement and has since become the norm for Node.js callbacks. This article aims to elucidate this pattern, its recommended practices, and what makes it so potent.

Why Establish a Standard?

The extensive usage of callbacks in Node.js harks back to a programming style predating JavaScript itself. Known as Continuation-Passing Style (CPS), this approach involves passing a “continuation function” (i.e., callback) as an argument to be executed once the remaining code has run. This mechanism allows different functions to transfer control asynchronously across an application.

Given Node.js’s reliance on asynchronous code for performance, having a dependable callback pattern is paramount. Without it, developers would grapple with maintaining diverse signatures and styles across modules. The error-first pattern was integrated into Node core to tackle this issue head-on and has since become the prevailing standard. While each use case may necessitate different responses and requirements, the error-first pattern can accommodate them all.

Defining an Error-First Callback

There are essentially two rules for defining an error-first callback:

  1. The first argument of the callback is reserved for an error object. If an error occurs, it is conveyed through this initial "err" argument.
  2. The second argument of the callback is reserved for any successful response data. In the absence of an error, "err" is set to null, and the successful data is provided in the second argument.
That’s the crux of it. It’s straightforward, isn’t it? Naturally, there are significant best practices to consider as well. But before delving into those, let’s illustrate this with a practical example using the basic method fs.readFile():

fs.readFile('/foo.txt', function(err, data) {
  // TODO: Error Handling Still Needed!
  console.log(data);
});

The fs.readFile() function reads from a specified file path and invokes your callback upon completion. If everything proceeds smoothly, the file contents are passed in the “data” argument. However, if an issue arises (e.g., the file doesn’t exist, permission is denied), the “err” argument will be populated with an error object detailing the problem.

It’s incumbent upon you, as the creator of the callback, to handle this error appropriately. You might choose to terminate the entire application by throwing an error, or you could propagate the error to the next callback in an asynchronous flow. The decision hinges on both the context and the desired behavior.

fs.readFile('/foo.txt', function(err, data) {
  // If an error occurs, handle it (throw, propagate, etc)
  if (err) {
    console.log('Unknown Error');
    return;
  }
  // Otherwise, log the file contents
  console.log(data);
});
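The same contract applies when authoring your own asynchronous functions. A minimal sketch, using a hypothetical loadConfig() helper:

var fs = require('fs');

// A hypothetical error-first function: read and parse a JSON config file
function loadConfig(path, callback) {
  fs.readFile(path, 'utf8', function(err, data) {
    // Propagate read errors to the caller untouched
    if (err) return callback(err);
    try {
      // Success: err is null, data comes second
      callback(null, JSON.parse(data));
    } catch (parseErr) {
      // Parse failures travel through the same err slot
      callback(parseErr);
    }
  });
}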

Err-ception: Propagating Your Errors

By passing errors to a callback, a function no longer needs to make assumptions about how those errors should be handled. For example, readFile() itself lacks insight into how critical a file read error might be to your specific application. It could be an anticipated occurrence or a catastrophic event. Instead of making such judgments internally, readFile() delegates the handling back to you.

Consistency with this pattern enables errors to be propagated upward as many times as necessary. Each callback can opt to ignore, address, or pass on the error based on the available information and context at its level.

if (err) {
  // Handle "Not Found" by responding with a custom error page
  if (err.fileNotFound) {
    return this.sendErrorMessage('File Does Not Exist');
  }
  // Ignore "No Permission" errors, as this controller recognizes them as insignificant
  // Propagate all other errors (Express will catch them)
  if (!err.noPermission) {
    return next(err);
  }
}

Slow Your Roll, Control Your Flow

Equipped with a robust callback protocol, you’re no longer confined to executing one callback at a time. Callbacks can be invoked in parallel, in a queue, sequentially, or any combination you envision. Whether you intend to read ten different files or make a hundred API calls, there’s no need to tackle them one by one.

The async library proves invaluable for advanced callback utilization. And thanks to the error-first callback pattern, integrating it is remarkably straightforward.

// Example from the caolan/async README
async.parallel({
  one: function(callback) {
    setTimeout(function() {
      callback(null, 1);
    }, 200);
  },
  two: function(callback) {
    setTimeout(function() {
      callback(null, 2);
    }, 100);
  }
},
function(err, results) {
  // results: {one: 1, two: 2}
});

Bringing it all Together

To witness these concepts in action, explore additional examples on GitHub. Of course, you’re free to disregard this callback paradigm altogether and delve into the realm of promises… but that’s a discussion for another time.

Team ThenodeWay

Our Teachers

Our instructors are experienced Node.js application development professionals who have extensive experience with the technology and deep knowledge in the relevant fields. We strive to attract leading experts among developers, engineers, and software architects to ensure that our students have access to the most up-to-date and high-quality information.

Every instructor at our site has not only a deep knowledge of Node.js, but also an understanding of how that knowledge is applied in practice. They are willing to share their own experiences and best practices with our students, helping them develop as Node.js application developers.

We are proud of our team of instructors and are confident in their ability to inspire and help students achieve their goals in Node.js application development.

Michael McCranie

A software architect with more than 15 years of experience. He has extensive knowledge in developing scalable and secure applications based on Node.js and teaches courses on this topic at technical universities.

Albert Durbin

DevOps engineer with experience in developing and maintaining highly loaded applications in Node.js. He has deep knowledge in deploying and scaling Node.js applications on cloud platforms.

Harlan Rodriguez

He is an expert in microservice architecture and API development on Node.js. He has many years of experience in large companies, where he actively used Node.js to create distributed systems.

Roy Olson

Experienced software developer with 10 years of experience. Specializes in developing high-load web applications on Node.js and is actively involved in the developer community.

Paul Newman

Full stack developer with a narrow specialization in backend development on Node.js. Worked in large technology companies and actively contributes to open source projects.

Wanda Hall

A software architect with more than 15 years of experience and extensive knowledge in developing scalable and secure applications based on Node.js.

Mandatory Testing

Crafting effective testing setups can be challenging regardless of the programming language you’re using. JavaScript’s adaptability can empower your tests with extensive access, but without caution, this same flexibility might lead to frustration later on. For instance, how do you go about testing API callbacks or handling ‘require’ statements? Without a well-structured setup, the ongoing debate over the viability of Test-Driven Development (TDD) becomes somewhat inconsequential.

This article aims to elucidate the necessary tools for proficient testing in Node.js. Together, these tools constitute a fundamental testing suite capable of addressing the needs of nearly any project. The setup prioritizes simplicity over complexity and advanced functionalities. If this approach seems counterintuitive, read on for further explanation.

Introduction: Opting for Clarity over Complexity

Before delving into the tools, it’s crucial to underscore the primary purpose of writing tests: assurance. Tests serve to instill confidence that all components operate as intended. In the event of a malfunction, tests should swiftly pinpoint the issue. Every line of code within a test file should serve this overarching objective.

However, contemporary frameworks often prioritize sophistication. While advanced features and technologies undoubtedly benefit developers, they can inadvertently obfuscate clarity. Although such frameworks may facilitate faster test execution or foster code reusability, do they truly enhance confidence in what’s being tested? Remember: the merit of tests lies in their clarity, not their cleverness.

Test transparency should reign supreme. If a framework prioritizes efficiency or ingenuity at the expense of clarity, it ultimately hampers your testing efforts.

The Indispensable Toolkit

Now, let’s introduce the indispensable toolkit for Node.js testing. Following the principles of The Node Way, which advocates for smaller, specialized tools over monolithic testing frameworks, this toolkit comprises several discrete tools, each excelling in a specific area:

  1. Testing Framework: (e.g., Mocha, Vows, Intern);
  2. Assertion Library: (e.g., Chai, Assert);
  3. Stubs: (e.g., Sinon);
  4. Module Control: (e.g., Mockery, Rewire).

Testing Framework

The cornerstone of your testing toolkit is a robust testing framework. Such a framework provides a structured foundation for your tests. Numerous options exist in this realm, each offering distinct features and design philosophies. Regardless of your choice, prioritize frameworks that facilitate the creation of clear, maintainable tests.

In the context of Node.js, Mocha stands out as the preeminent choice. With a long-standing presence in the ecosystem, Mocha boasts robust testing capabilities and extensive customization options. Despite lacking flashy features, Mocha’s setup/teardown pattern promotes the creation of explicit, comprehensible, and easy-to-follow tests.

describe('yourModuleName', function() {
  before(function() {
    // One-time setup before all tests in the suite
  });
  beforeEach(function() {
    // Setup before each test
  });
  it('does x when y', function() {
    // Test implementation
  });
  after(function() {
    // Teardown after all tests have completed
  });
});

Assertion Library

Once you’ve established your testing framework, an assertion library becomes indispensable for writing tests effectively. Assertion libraries enable you to verify expected outcomes and raise errors when assertions fail. Various libraries and syntax styles are available, catering to different testing paradigms. Whether you prefer Test-Driven Development (TDD) or Behavior-Driven Development (BDD), select a library that aligns with your preferences. Chai is particularly versatile, supporting multiple assertion styles, but Node.js also provides a basic assertion library out of the box.

var life = 40 + 2;
expect(life).to.equal(42); // BDD style assertion
assert.equal(life, 42); // TDD style assertion

Stubs

Despite the utility of assertions, they have limitations, especially when testing complex functions. Stubs allow you to manipulate the testing environment, influencing code behavior under specific conditions. Sinon, a prominent stubbing library, facilitates this process by enabling you to define expected behaviors for functions and API calls.

var callback = sinon.stub();
callback.withArgs(42).returns(1);
callback.withArgs(1).throws("TypeError");
callback(); // No return value, no exception
callback(42); // Returns 1
callback(1); // Throws TypeError

Sinon also offers additional functionalities, such as spies for monitoring function calls and mocks for setting behavior expectations.
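A quick sketch of a spy in action:

var sinon = require('sinon');
var assert = require('assert');

var spy = sinon.spy();
spy('hello');

assert(spy.calledOnce);          // the spy records each call…
assert(spy.calledWith('hello')); // …and the arguments it received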

Module Control

Before commencing with testing, you must address a significant hurdle: the ‘require’ statement. Since ‘require’ calls typically occur internally, accessing, stubbing, or asserting external modules from tests poses a challenge. To overcome this obstacle, you need control over ‘require’.

Mockery and Rewire offer distinct approaches to module control. Mockery allows you to manipulate the internal module cache and replace modules with custom objects. Remember to disable and deregister mocks after test execution.

before(function() {
  mockery.enable();
  // Allow specific modules
  mockery.registerAllowable('async');
  // Control others
  mockery.registerMock('../some-other-module', stubbedModule);
});

after(function() {
  mockery.deregisterAll();
  mockery.disable();
});

Rewire provides even more advanced capabilities, enabling access to and modification of private variables within modules. Exercise caution with Rewire, as excessive manipulation may deviate too far from the original module behavior.
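For illustration, here is a minimal Rewire sketch (the module path and variable name are hypothetical):

var rewire = require('rewire');

// Load the module through rewire() instead of require()
var myModule = rewire('../lib/my-module'); // hypothetical path

// Read and replace private, unexported variables inside it
var original = myModule.__get__('privateCounter');
myModule.__set__('privateCounter', 42);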

Bringing it All Together

To witness these tools in action, refer to a functional example on GitHub. While certain libraries were highlighted above, numerous alternatives exist within each toolkit category.

By embracing this comprehensive toolkit, you can establish a robust testing infrastructure tailored to your project’s needs. Remember, prioritize clarity and simplicity in your testing endeavors, as these qualities are paramount for effective testing practices.
