Tag Archives: node.js

React Redux – Actions & Reducers

Having immersed myself in coding JavaScript exclusively with Node.js and React over the past couple of months, I’ve come to appreciate the versatility and robustness the “combo” has to offer. I’ve always liked the minimalist design of Node.js and consider it a top candidate whenever an app/API server needs to be built. Besides ordinary app servers, Node has also been picked on a few occasions to serve decentralized applications (dApps) that involve smart contract deployments to public blockchains. In fact, Node and React are also a popular tech stack for dApp frameworks such as Scaffold-ETH.

React & React Redux

React is relatively new to me, though it’s rather easy to pick up the basics from React‘s official site. And many tutorials out there showcase how to build applications using React along with the feature-rich toolset within the React ecosystem. For instance, this tutorial code repo offers helpful insight for developing a React application with basic CRUD.

React can be complemented with Redux, which provides a central store for state updates across UI components. Contrary to the local state maintained within a React component (oftentimes used for handling interactive state changes to input form elements), the central store can be shared across multiple components. That’s a key feature useful for the R&D project at hand.

Rather than just providing a plain global state repository for direct access, the Redux store is by design “decoupled” from the components. React Redux allows custom programmatic actions to be structured by user-defined action types. To trigger a state change, a component invokes the dispatch() function with an action; that’s the only mechanism by which the store gets updated.

React actions & reducers

In general, a React action, which is oftentimes dispatched in response to a UI event (e.g. a click on a button), mainly does two things:

  1. It carries out the defined action, oftentimes an asynchronous function that invokes a user-defined React service (for instance, a client HTTP call to a Node.js server).
  2. It connects with the Redux store and gets funneled into a reduction process. The reduction is performed thru a user-defined reducer, which typically aggregates state for the corresponding action type.

An action might look something like below:

const myAction = () => async (dispatch) => {
  try {
    const res = await myService.someFunction();
    dispatch({
      type: someActionType,
      payload: res.data,
    });
  } catch (err) {
    ...
  }
};

whereas a reducer generally has the following function signature:

const myReducer = (currState = initState, action) => {
  const { type, payload } = action;
  switch (type) {
    case someActionType:
      return someFormOfPayload;
    case anotherActionType:
      return anotherFormOfPayload;
    ...
    default:
      return currState;
  }
};

Example of a React action

${react-project-root}/src/actions/user.js

import {
  CREATE_USER,
  RETRIEVE_USERS,
  UPDATE_USER,
  DELETE_USER
} from "./types";
import UserDataService from "../services/user.service";
export const createUser = (username, password, email, firstName, lastName) => async (dispatch) => {
  try {
    const res = await UserDataService.create({ username, password, email, firstName, lastName });
    dispatch({
      type: CREATE_USER,
      payload: res.data,
    });
    return Promise.resolve(res.data);
  } catch (err) {
    return Promise.reject(err);
  }
};
export const findUsersByEmail = (email) => async (dispatch) => {
  try {
    const res = await UserDataService.findByEmail(email);
    dispatch({
      type: RETRIEVE_USERS,
      payload: res.data,
    });
  } catch (err) {
    console.error(err);
  }
};
export const updateUser = (id, data) => async (dispatch) => {
  try {
    const res = await UserDataService.update(id, data);
    dispatch({
      type: UPDATE_USER,
      payload: data,
    });
    return Promise.resolve(res.data);
  } catch (err) {
    return Promise.reject(err);
  }
};
export const deleteUser = (id) => async (dispatch) => {
  try {
    await UserDataService.delete(id);
    dispatch({
      type: DELETE_USER,
      payload: { id },
    });
  } catch (err) {
    console.error(err);
  }
};
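For completeness, the actions above reference two supporting modules that aren’t shown: the action-type constants and the data service. Below is a minimal sketch of what they might look like; the axios-based implementation, the endpoints and the API base URL are my assumptions rather than the project’s actual code.

${react-project-root}/src/actions/types.js

export const CREATE_USER = "CREATE_USER";
export const RETRIEVE_USERS = "RETRIEVE_USERS";
export const UPDATE_USER = "UPDATE_USER";
export const DELETE_USER = "DELETE_USER";

${react-project-root}/src/services/user.service.js (hypothetical)

import axios from "axios";

// Hypothetical HTTP client pointing at the Node.js API server
const http = axios.create({
  baseURL: "http://localhost:8080/api",
  headers: { "Content-type": "application/json" }
});

const UserDataService = {
  create: (data) => http.post("/users", data),                 // used by createUser
  findByEmail: (email) => http.get(`/users?email=${email}`),   // used by findUsersByEmail
  update: (id, data) => http.put(`/users/${id}`, data),        // used by updateUser
  delete: (id) => http.delete(`/users/${id}`)                  // used by deleteUser
};

export default UserDataService;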

Example of a React reducer

${react-project-root}/src/reducers/users.js

import {
  CREATE_USER,
  RETRIEVE_USERS,
  UPDATE_USER,
  DELETE_USER
} from "../actions/types";
const initState = [];
function userReducer(users = initState, action) {
  const { type, payload } = action;
  switch (type) {
    case CREATE_USER:
      return [...users, payload];
    case RETRIEVE_USERS:
      return payload;
    case UPDATE_USER:
      return users.map((user) => {
        if (user.id === payload.id) {
          return {
            ...user,
            ...payload,
          };
        } else {
          return user;
        }
      });
    case DELETE_USER:
      return users.filter(({ id }) => id !== payload.id);
    default:
      return users;
  }
}
export default userReducer;
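The reducer then needs to be registered with the Redux store, and the store made available to the component tree via React Redux’s Provider. Here’s a minimal sketch of that wiring, assuming the classic redux + redux-thunk setup (a newer project might use Redux Toolkit’s configureStore instead); the file names are assumptions.

${react-project-root}/src/store.js (hypothetical)

import { createStore, applyMiddleware, combineReducers } from "redux";
import thunk from "redux-thunk";
import users from "./reducers/users";

// Mount each reducer under the key that components select on, e.g. state.users
const rootReducer = combineReducers({ users });

// redux-thunk lets dispatch() accept the async (dispatch) => {...} actions shown above
const store = createStore(rootReducer, applyMiddleware(thunk));

export default store;

${react-project-root}/src/index.js (hypothetical)

import React from "react";
import ReactDOM from "react-dom";
import { Provider } from "react-redux";
import store from "./store";
import App from "./App";

// Provider makes the store accessible to useDispatch/useSelector in any component
ReactDOM.render(
  <Provider store={store}>
    <App />
  </Provider>,
  document.getElementById("root")
);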

React components

Using React Hooks, which are built-in functions, the UI-centric React components can harness powerful features such as local state (useState), side effects (useEffect), shared context, and more.

To dispatch an action, React Redux’s useDispatch hook can be used, as shown below:

import { useDispatch, useSelector } from "react-redux";
...
  const dispatch = useDispatch();
  ...
    dispatch(myAction(someRecord.id, someRecord))  // Corresponding service returns a promise
      .then((response) => {
        setMessage("myAction successful!");
        ...
      })
      .catch(err => {
        ...
      });
  ...

And to retrieve the state of a certain item from the Redux store, the useSelector hook allows one to use a selector function to extract the target item as follows:

  const myRecords = useSelector(state => state.myRecords);  // Reducer myRecords.js

Example of a React component

${react-project-root}/src/components/UserList.js

import React, { useState, useEffect } from "react";
import { useDispatch, useSelector } from "react-redux";
import { Link } from "react-router-dom";
import { retrieveUsers, findUsersByEmail } from "../actions/user";
const UserList = () => {
  const dispatch = useDispatch();
  const users = useSelector(state => state.users);
  const [currentUser, setCurrentUser] = useState(null);
  const [currentIndex, setCurrentIndex] = useState(-1);
  const [searchEmail, setSearchEmail] = useState("");
  useEffect(() => {
    dispatch(retrieveUsers());
  }, [dispatch]);
  const onChangeSearchEmail = e => {
    const searchEmail = e.target.value;
    setSearchEmail(searchEmail);
  };
  const refreshData = () => {
    setCurrentUser(null);
    setCurrentIndex(-1);
  };
  const setActiveUser = (user, index) => {
    setCurrentUser(user);
    setCurrentIndex(index);
  };
  const findByEmail = () => {
    refreshData();
    dispatch(findUsersByEmail(searchEmail));
  };
  return (
    <div className="list row">
      <div className="col-md-9">
        <div className="input-group mb-3">
          <input
            type="text"
            className="form-control"
            id="searchByEmail"
            placeholder="Search by email"
            value={searchEmail}
            onChange={onChangeSearchEmail}
          />
          <div className="input-group-append">
            <button
              className="btn btn-warning m-2"
              type="button"
              onClick={findByEmail}
            >
              Search
            </button>
          </div>
        </div>
      </div>
      <div className="col-md-5">
        <h4>User List</h4>
        <ul className="list-group">
          {users &&
            users.map((user, index) => (
              <li
                className={
                  "list-group-item " + (index === currentIndex ? "active" : "")
                }
                onClick={() => setActiveUser(user, index)}
                key={index}
              >
                <div className="row">
                  <div className="col-md-2">{user.id}</div>
                  <div className="col-md-10">{user.email}</div>
                </div>
              </li>
            ))}
        </ul>
        <Link to="/add-user"
          className="btn btn-warning mt-2 mb-2"
        >
          Create a user
        </Link>
      </div>
      <div className="col-md-7">
        {currentUser ? (
          <div>
            <h4>User</h4>
            <div className="row">
              <div className="col-md-3 fw-bold">ID:</div>
              <div className="col-md-9">{currentUser.id}</div>
            </div>
            <div className="row">
              <div className="col-md-3 fw-bold">Username:</div>
              <div className="col-md-9">{currentUser.username}</div>
            </div>
            <div className="row">
              <div className="col-md-3 fw-bold">Email:</div>
              <div className="col-md-9">{currentUser.email}</div>
            </div>
            <div className="row">
              <div className="col-md-3 fw-bold">First Name:</div>
              <div className="col-md-9">{currentUser.firstName}</div>
            </div>
            <div className="row">
              <div className="col-md-3 fw-bold">Last Name:</div>
              <div className="col-md-9">{currentUser.lastName}</div>
            </div>
            <Link
              to={"/user/" + currentUser.id}
              className="btn btn-warning mt-2 mb-2"
            >
              Edit
            </Link>
          </div>
        ) : (
          <div>
            <br />
            <p>Please click on a user for details ...</p>
          </div>
        )}
      </div>
    </div>
  );
};
export default UserList;

It should be noted that, despite having been stripped down for simplicity, the above sample code might still include a bit too much detail for React beginners. For now, the primary goal is to highlight how an action, dispatched via dispatch() in response to a certain UI event, interactively updates state in the Redux central store thru a corresponding reducer function.

In the next blog post, we’ll dive a little deeper into React components and how they have evolved from the class-based OOP (object oriented programming) to the FP (functional programming) style with React Hooks.

Node.js, PostgreSQL With Sequelize

A recent project has prompted me to adopt Node.js, a popular server designed to be lean and mean, as the server-side tech stack. With the requirement for a web application with a rather feature-rich UI, I include React (a.k.a. ReactJS) as part of the tech stack. A backend database is needed, so I pick PostgreSQL. Thus, this is a deviation from the Scala / Akka Actor / Akka Stream tech stack I’ve been using in recent years.

PostgreSQL has always been one of my favorite database choices whenever a robust RDBMS with decent scalability is required for a given R&D project. With Node.js being the chosen app/API server and React the UI library for the project at hand, I decided to use Sequelize, a popular ORM tool in the Node ecosystem.

First and foremost, I must acknowledge the effective documentation on Sequelize’s official website, which allows developers new to it to quickly pick up the essential know-how, from the getting-started basics to the more advanced topics.

Getting started

Assuming a Node.js project is already in place, install the PostgreSQL driver and Sequelize under the Node project root directory:

$ npm install --save pg pg-hstore
$ npm install --save sequelize

Next, create a configuration script ${node-project-root}/app/config/db.config.js for PostgreSQL like below:

module.exports = {
  HOST: "localhost",
  USER: "leo",
  PASSWORD: "changeme!",
  DB: "leo",
  dialect: "postgres",
  pool: {
    max: 5,
    min: 0,
    acquire: 30000,
    idle: 10000
  }
};

For the data model, let’s create script files for a few sample tables under ${node-project-root}/app/models/:

# user.model.js 

module.exports = (sequelize, Sequelize) => {
  const User = sequelize.define("users", {
    username: {
      type: Sequelize.STRING
    },
    email: {
      type: Sequelize.STRING
    },
    password: {
      type: Sequelize.STRING
    },
    firstName: {
      type: Sequelize.STRING
    },
    lastName: {
      type: Sequelize.STRING
    }
  });
  return User;
};
# role.model.js

module.exports = (sequelize, Sequelize) => {
  const Role = sequelize.define("roles", {
    id: {
      type: Sequelize.INTEGER,
      primaryKey: true
    },
    name: {
      type: Sequelize.STRING
    }
  });
  return Role;
};
# order.model.js

module.exports = (sequelize, Sequelize) => {
  const Order = sequelize.define("orders", {
    orderDate: {
      type: Sequelize.DATE
    },
    userId: {
      type: Sequelize.INTEGER
    },
    // add other attributes here ...
  });
  return Order;
};
# item.model.js

module.exports = (sequelize, Sequelize) => {
  const Item = sequelize.define("items", {
    serialNum: {
      type: Sequelize.STRING
    },
    orderId: {
      type: Sequelize.INTEGER
    },
    // add other attributes here ...
  });
  return Item;
};

Sequelize instance

Note that within the above data model scripts, each of the table entities is represented by a function with two arguments — Sequelize refers to the Sequelize library, whereas sequelize is an instance of it. The instance is what’s required to connect to a given database. It has a method define() responsible for specifying the table definition including the table attributes and the by-default pluralized table name.

Also note that it looks as though the typical primary key column id is missing in most of the above table definitions. That’s because Sequelize automatically creates an auto-increment integer column id if none is specified. For a table intended to be set up with specific primary key values, define the column explicitly (similar to how table roles is set up in our sample models).

The Sequelize instance is created and initialized within ${node-project-root}/app/models/index.js as shown below.

# ${node-project-root}/app/models/index.js

const config = require("../config/db.config.js");
const Sequelize = require("sequelize");
const sequelize = new Sequelize(
  config.DB,
  config.USER,
  config.PASSWORD,
  {
    host: config.HOST,
    dialect: config.dialect,
    pool: {
      max: config.pool.max,
      min: config.pool.min,
      acquire: config.pool.acquire,
      idle: config.pool.idle
    }
  }
);
const db = {};
db.Sequelize = Sequelize;
db.sequelize = sequelize;
db.user = require("../models/user.model.js")(sequelize, Sequelize);
db.role = require("../models/role.model.js")(sequelize, Sequelize);
db.order = require("../models/order.model.js")(sequelize, Sequelize);
db.item = require("../models/item.model.js")(sequelize, Sequelize);
db.role.belongsToMany(db.user, {
  through: "user_role"
});
db.user.belongsToMany(db.role, {
  through: "user_role"
});
db.user.hasMany(db.order, {
  as: "order"
});
db.order.belongsTo(db.user, {
  foreignKey: "userId",
  as: "user"
});
db.order.hasMany(db.item, {
  as: "item"
});
db.item.belongsTo(db.order, {
  foreignKey: "orderId",
  as: "order"
});
db.ROLES = ["guest", "user", "admin"];
module.exports = db;

Data model associations

As can be seen from the index.js data model script, a Sequelize instance is instantiated with the database configuration information from db.config.js, and the table definitions are then loaded from the individual model scripts.

Also included in the index.js script are examples of both the one-to-many and many-to-many association types. For instance, the relationship between tables users and orders is one-to-many, with userId as the foreign key:

db.user.hasMany(db.order, {
  as: "order"
});
db.order.belongsTo(db.user, {
  foreignKey: "userId",
  as: "user"
});

whereas the relationship between users and roles is many-to-many:

db.role.belongsToMany(db.user, {
  through: "user_role"
});
db.user.belongsToMany(db.role, {
  through: "user_role"
});

Database schema naming conventions

Contrary to the camelCase naming style for variables in programming languages such as JavaScript, Java and Scala, conventional RDBMSes tend to use the snake_case naming style for table and column names. To accommodate the different naming conventions, Sequelize can map camelCase attribute names in the JavaScript models to snake_case names in the database schema (e.g. a model attribute firstName would map to a column first_name). To keep the database schema in snake_case style, customize the Sequelize instance by specifying underscored: true within the define {} segment as shown below.

As mentioned in an earlier section, Sequelize pluralizes database table names by default. To suppress the auto-pluralization, also specify freezeTableName: true within the define {} segment and give the tables singular names within the individual model scripts.

const sequelize = new Sequelize(
  config.DB,
  config.USER,
  config.PASSWORD,
  {
    host: config.HOST,
    dialect: config.dialect,
    pool: {
      max: config.pool.max,
      min: config.pool.min,
      acquire: config.pool.acquire,
      idle: config.pool.idle
    },
    define: {
      underscored: true,
      freezeTableName: true
    }
  }
);

An “inconvenience” in PostgreSQL

Personally, I prefer keeping database table names singular. However, I have a table I’d like to name user, which is disallowed within PostgreSQL’s default schema namespace because user is a reserved keyword in PostgreSQL.

A work-around would be to define a custom schema that serves as a namespace in which all user-defined entities are contained. An inconvenient consequence is that when performing queries using tools like psql, one would need to alter the schema search path from the default public schema to the new one.

ALTER ROLE leo SET search_path TO myschema;
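For illustration, here’s a hedged sketch of what that workaround might look like on the Sequelize side, with myschema as a placeholder name (the schema itself would first be created in PostgreSQL with CREATE SCHEMA myschema;):

// Hypothetical model script: a singular "user" table placed under a custom schema
module.exports = (sequelize, Sequelize) => {
  const User = sequelize.define("user", {
    username: {
      type: Sequelize.STRING
    },
    email: {
      type: Sequelize.STRING
    }
    // other attributes ...
  }, {
    freezeTableName: true,  // keep the singular table name "user"
    schema: "myschema"      // place the table under the custom schema, not public
  });
  return User;
};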

After weighing the pros and cons, I decided to go with Sequelize‘s default pluralized table naming. Other than this minor inconvenience, I find Sequelize an easy-to-pick-up ORM for wiring programmatic CRUD operations with PostgreSQL from within Node’s controller modules.

The following sample snippet highlights what a simple find-by-primary-key select and update might look like in a Node controller:

const db = require("../models");
const User = db.user;
...

exports.find = (req, res) => {
  const id = req.params.id;
  User.findByPk(id)
    .then(data => {
      if (data) {
        res.send(data);
      } else {
        res.status(404).send({
          message: `ERROR finding user with id=${id}!`
        });
      }
    })
    .catch(err => {
      res.status(500).send({
        message: `ERROR retrieving user data!`
      });
    });
};

exports.update = (req, res) => {
  const id = req.params.id;
  User.update(req.body, {
    where: { id: id }
  })
    .then(([num]) => {  // Sequelize update() resolves to an array: [affectedCount]
      if (num === 1) {
        res.send({
          message: "User was updated successfully!"
        });
      } else {
        res.send({
          message: `ERROR updating user with id=${id}!`
        });
      }
    })
    .catch(err => {
      res.status(500).send({
        message: `ERROR updating user data!`
      });
    });
};
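For context, below is a minimal sketch of how such a controller might be wired into a route and how the Sequelize instance gets synced at server startup. It assumes Express (which the req/res usage above implies); the file names, route paths and port number are my assumptions.

# ${node-project-root}/server.js (hypothetical)

const express = require("express");
const db = require("./app/models");
const users = require("./app/controllers/user.controller");

const app = express();
app.use(express.json());

// Map REST endpoints to the controller functions shown above
app.get("/api/users/:id", users.find);
app.put("/api/users/:id", users.update);

// Create the tables (plus the user_role junction table) if they don't exist yet
db.sequelize.sync().then(() => {
  app.listen(8080, () => console.log("Server is running on port 8080."));
});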

In the next blog post, we’ll shift our focus towards the popular UI library React and how state changes propagate across the UI components and the React Redux central store.

Ethereum-compatible NFT On Avalanche

While blockchain has been steadily gaining attention from the general public over the past couple of years, it’s NFT, short for non-fungible token, that has recently taken center stage. In particular, NFT shines in the area of provenance of authenticity. By programmatically binding a given asset to a unique digital token referencing immutable associated transactions on a blockchain, the NFT essentially serves as the “digital receipt” of the asset.

Currently Ethereum is undergoing a major upgrade to cope with future growth of the blockchain platform, which has been suffering from low transaction rates and high gas fees due to the existing unscalable Proof of Work consensus algorithm. As described in a previous blockchain overview blog post, off-chain solutions, including bridging the Ethereum main chain with layer-2 subchains such as Polygon, help circumvent the performance issue.

Avalanche

Some layer-1 blockchains support Ethereum’s NFT standards (e.g. ERC-721, ERC-1155) in addition to providing their own native NFT specs. Among them is Avalanche, which has been steadily growing its market share (in terms of TVL), trailing only a couple of prominent layer-1 blockchains such as Solana and Cardano.

With separation of concerns (SoC) being one of the underlying design principles, Avalanche uses a subnet model in which validators on a subnet only operate on the specific blockchains of their interest. Also in line with the SoC design principle, Avalanche comes with 3 built-in blockchains, each of which serves specific purposes with its own set of APIs:

  • Exchange Chain (X-Chain) – for creation & exchange of digital smart assets (including its native token AVAX) which are bound to programmatic governance rules
  • Platform Chain (P-Chain) – for creating & tracking subnets, each comprising a dynamic group of stakeholders responsible for consensually validating blockchains of interest
  • Contract Chain (C-Chain) – for developing smart contract applications

NFT on Avalanche

Avalanche allows creation of native NFTs as a kind of its smart digital assets, and its website provides tutorials for creating such NFTs using the Go-based AvalancheGo API. But perhaps it’s the support of Ethereum-compatible NFT standards, at a much higher transaction rate and lower cost than on the existing Ethereum mainnet, that has helped popularize the platform.

In this blog post, we’re going to create ERC-721 compliant NFTs on the Avalanche platform, which requires programmatic implementation of their sale/transfer terms in smart contracts. C-Chain is therefore the targeted blockchain. And rather than deploying our NFTs on the Avalanche mainnet, we’ll use the Avalanche Fuji Testnet, which allows developers to pay for transactions with test-only AVAX tokens freely available from a designated crypto faucet.

Scaffold-ETH: an Ethereum development stack

Scaffold-ETH, a code repository of comprehensive Ethereum-based blockchain computing functions, offers a suite of tech stacks well suited for fast prototyping, along with sample code for various use cases of decentralized applications. The stacks include Solidity, Hardhat, Ethers.js and ReactJS.

The following software is required for installing Scaffold-ETH and for building and deploying the NFT smart contracts: Node.js, Yarn, Git, and a web browser with the MetaMask extension.

Launching NFTs on Avalanche using a customized Scaffold-ETH

For the impatient, the revised code repo is at this GitHub link. Key changes made to the original branch in Scaffold-ETH will be highlighted at the bottom of this post.

To get a copy of Scaffold-ETH repurposed for NFTs on Avalanche, first git-clone the repo:

git clone https://github.com/oel/avalanche-scaffold-eth-nft avax-scaffold-eth-nft

Next, open up a couple of shell command terminals and navigate to the project-root (e.g. avax-scaffold-eth-nft).

Step 1: From the 1st shell terminal, install the necessary dependent modules.

cd avax-scaffold-eth-nft/
yarn install

Step 2: From the 2nd terminal, specify an account as the deployer.

Choose an account that owns some AVAX tokens (otherwise, get free tokens from an AVAX faucet) on the Avalanche Fuji testnet and create file packages/hardhat/mnemonic.txt with the account’s 12-word mnemonic in it.

cd avax-scaffold-eth-nft/
yarn account
yarn deploy --network fujiAvalanche

For future reference, the “deployed at” smart contract address should be saved. Transactions oriented around the smart contract can be reviewed at snowtrace.io.

Step 3: Back to the 1st terminal, start the Node.js server at port# 3000.

yarn start

This will spawn a web page in the default browser (which should have the MetaMask extension installed).

Step 4: From the web browser, connect to the MetaMask account which will receive the NFTs.

Step 5: Back to the 2nd terminal, mint the NFTs.

yarn mint --network fujiAvalanche

You will be prompted for the address of the NFT recipient account connected to the browser app. Upon successful minting, images of the NFTs should be automatically displayed on the web page.

To transfer any of the NFTs to another account, enter the address of the account to be transferred to and click “transfer”. Note that the account connected to the browser app would need to own some AVAX tokens (again if not, get free tokens from an AVAX faucet).

The web page upon successful minting should look like below:

Avalanche NFTs using Scaffold-ETH (MetaMask connected)

Key changes made to the original Scaffold-ETH branch

It should be noted that Scaffold-ETH is a popular code repo under active development. The branch I had experimented with a few months ago is already markedly different from the same branch I git-cloned for custom modification. That prompted me to clone a separate repo to serve as a “snapshot” of the branch, rather than just showing my modifications to an evolving code base.

Below are the main changes made to the Scaffold-ETH Simple NFT Example branch git-cloned on March 30:

Hardhat configuration script: packages/hardhat/hardhat.config.js

The defaultNetwork value in the original Hardhat configuration script is “localhost” by default, assuming a local instance of a selected blockchain is in place. The following change sets the default network to the Fuji testnet, whose network configuration parameters need to be added as shown below.

const defaultNetwork = "fujiAvalanche";
// const defaultNetwork = "mainnetAvalanche";
...
module.exports = {
  ...
  networks: {
    ...
    fujiAvalanche: {
      url: "https://api.avax-test.network/ext/bc/C/rpc",
      gasPrice: 225000000000,
      chainId: 43113,
      accounts: {
        mnemonic: mnemonic(),
      },
    },
    ...

Note that with the explicit defaultNetwork value set to “fujiAvalanche”, one could skip the --network fujiAvalanche command line option in the smart contract deploy and mint commands.
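For reference, the commented-out mainnetAvalanche default above would need a corresponding entry under networks. Using Avalanche’s public C-Chain endpoint and chain ID, it would look roughly like the following (shown as a sketch; the original branch may already include it among its elided network entries):

    mainnetAvalanche: {
      url: "https://api.avax.network/ext/bc/C/rpc",
      gasPrice: 225000000000,
      chainId: 43114,
      accounts: {
        mnemonic: mnemonic(),
      },
    },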

ReactJS main app: packages/react-app/src/App.jsx

To avoid a compilation error, the following imports need to be moved up above the variable declaration section in the main React app.

import { useContractConfig } from "./hooks"
import Portis from "@portis/web3";
import Fortmatic from "fortmatic";
import Authereum from "authereum";

...
const targetNetwork = NETWORKS.fujiAvalanche;
// const targetNetwork = NETWORKS.mainnetAvalanche

Minting script: packages/hardhat/scripts/mint.js

A few notes:

  • The square-shaped animal icon images for the NFTs used in the minting script are from public domain sources. Here’s the link to the author’s website.
  • The Node module prompt-sync is used (and thus added to the main package.json dependency list) to avoid having to hardcode the NFT recipient address in the minting script.
  • The code below makes the variable toAddress a dynamic input value and replaces the original NFT images with the square-styled images, along with a modularized mintItem function.
...
const prompt = require('prompt-sync')();

const delayMS = 5000  // Increase delay as needed!

const main = async () => {

  // ADDRESS TO MINT TO:
  // const toAddress = "0x36f90A958f94F77c26614DB170a5C8a7DF062A90"
  const toAddress = prompt("Enter the address to mint to: ");

  console.log("\n\n 🎫 Minting to "+toAddress+"...\n");

  const { deployer } = await getNamedAccounts();
  const yourCollectible = await ethers.getContract("YourCollectible", deployer);

  // Item #1

  const iconCrocodile = {
    "description": "Squared Croc Icon",
    "external_url": "https://blog.genuine.com/",
    "image": "https://blog.genuine.com/wp-content/uploads/2022/03/Crocodile-icon.png",
    "name": "Squared Crocodile",
    "attributes": [
       {
         "trait_type": "Color",
         "value": "Green"
       }
    ]
  }
  mintItem(iconCrocodile, yourCollectible, toAddress)

  await sleep(delayMS)

  // Item #2

  const iconDuck = {
    "description": "Squared Duck Icon",
    "external_url": "https://blog.genuine.com/",
    "image": "https://blog.genuine.com/wp-content/uploads/2022/03/Duck-icon.png",
    "name": "Squared Duck",
    "attributes": [
       {
         "trait_type": "Color",
         "value": "Yellow"
       }
    ]
  }
  mintItem(iconDuck, yourCollectible, toAddress)

  await sleep(delayMS)

  // Item #3

  const iconEagle = {
    "description": "Squared Eagle Icon",
    "external_url": "https://blog.genuine.com/",
    "image": "https://blog.genuine.com/wp-content/uploads/2022/03/Eagle-icon.png",
    "name": "Squared Eagle",
    "attributes": [
       {
         "trait_type": "Color",
         "value": "Dark Gray"
       }
    ]
  }
  mintItem(iconEagle, yourCollectible, toAddress)

  await sleep(delayMS)

  // Item #4

  const iconElephant = {
    "description": "Squared Elephant Icon",
    "external_url": "https://blog.genuine.com/",
    "image": "https://blog.genuine.com/wp-content/uploads/2022/03/Elephant-icon.png",
    "name": "Squared Elephant",
    "attributes": [
       {
         "trait_type": "Color",
         "value": "Light Gray"
       }
    ]
  }
  mintItem(iconElephant, yourCollectible, toAddress)

  await sleep(delayMS)

  // Item #5

  const iconFish = {
    "description": "Squared Fish Icon",
    "external_url": "https://blog.genuine.com/",
    "image": "https://blog.genuine.com/wp-content/uploads/2022/03/Fish-icon.png",
    "name": "Squared Fish",
    "attributes": [
       {
         "trait_type": "Color",
         "value": "Blue"
       }
    ]
  }
  mintItem(iconFish, yourCollectible, toAddress)

  await sleep(delayMS)

  console.log("Transferring Ownership of YourCollectible to "+toAddress+"...")

  await yourCollectible.transferOwnership(toAddress, { gasLimit: 8000000 });  // Increase limit as needed!

  await sleep(delayMS)

  ...
}

async function mintItem(item, contract, mintTo, limit = 8000000) {  // Increase limit as needed!
  console.log("Uploading `%s` ...", item.name)
  const uploaded = await ipfs.add(JSON.stringify(item))

  console.log("Minting `%s` with IPFS hash ("+uploaded.path+") ...", item.name)
  await contract.mintItem(mintTo,uploaded.path,{gasLimit:limit})
}
...
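As a side note, the sleep helper awaited between mints (elided in the excerpt above) is typically just a promise-based delay along these lines:

// Promise-based delay used to space out the mint transactions
function sleep(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}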