spark-property-manager

v1.1.19 · Published

Spark Real Estate Management Application

Downloads: 71

Readme

Spark Property Manager

A real estate property management web application that makes bookkeeping and work orders easy. Record expenses against each property unit with an uploaded receipt image, or bulk-import your bank account statement as a flat file with configurable columns.

  • Technologies: Node.js, PostgreSQL

Steps to Set Up

PostgreSQL

$ sudo apt-get update
$ sudo apt-get install postgresql postgresql-contrib

Create User and Database

$ createuser -P -s dbusername --createdb

If that doesn't create the user and database, create them manually:

$ sudo -u postgres psql
# CREATE USER username WITH PASSWORD 'password';
# ALTER USER username SUPERUSER;
# CREATE DATABASE dbname OWNER username;
# \q

Log in as the created user and create the pgcrypto extension for password encryption

$ psql -U username -d dbname
# CREATE EXTENSION pgcrypto;
# \q

Memcached

Install and start Memcached

$ sudo apt-get update
$ sudo apt-get install memcached
$ sudo apt-get install libmemcached-tools
$ sudo systemctl restart memcached
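
To confirm Memcached is reachable from Node.js before wiring it into the app, a quick check with the memcached npm client can help. The client package and key names below are only for illustration; the application itself may integrate Memcached differently (e.g. through a session store):

// check-memcached.js — minimal connectivity check (illustrative only)
const Memcached = require('memcached');           // npm install memcached
const memcached = new Memcached('127.0.0.1:11211');

memcached.set('spark:ping', 'pong', 10, (err) => {
  if (err) throw err;
  memcached.get('spark:ping', (err, value) => {
    if (err) throw err;
    console.log('memcached responded with:', value); // expect "pong"
    memcached.end();
  });
});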

Install Node.js

Node.js install guide

Download spark-property-manager

$ git clone https://github.com/wsapiens/spark-property-manager.git

or download the tarball via npm

$ npm pack spark-property-manager

$ tar -xvf spark-property-manager-{version}.tgz

Install dependencies

$ cd spark-property-manager
spark-property-manager $ npm install

Run database migration

First, set up the Sequelize CLI config.json

$ vi config/config.json
{
  "development": {
    "username": "dbuser",
    "password": "dbpass",
    "database": "dbname",
    "host": "127.0.0.1",
    "dialect": "postgres"
  }
}
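
For reference, these values map directly onto the standard Sequelize constructor. A minimal connectivity check before migrating, assuming the development block above, could look like this (the file name and script are illustrative, not part of the project):

// check-db.js — verify the database settings before migrating (illustrative)
const { Sequelize } = require('sequelize');
const config = require('./config/config.json').development;

const sequelize = new Sequelize(config.database, config.username, config.password, {
  host: config.host,
  dialect: config.dialect,
});

sequelize.authenticate()
  .then(() => console.log('Database connection OK'))
  .catch((err) => console.error('Unable to connect:', err));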

Run DB migration and generate seed data

$ node_modules/.bin/sequelize db:migrate
$ node_modules/.bin/sequelize db:seed:all

After running the DB migration and generating seed data, create the base company and an initial login user

$ psql -U username -d dbname
# SELECT COUNT(*) FROM company; -- it would be zero
# INSERT INTO company(name) VALUES ('base company'); -- create base company
# CREATE EXTENSION pgcrypto;
# INSERT INTO login_user(company_id, email, password, is_admin, is_manager) VALUES (1, 'email', crypt('password', gen_salt('bf')), true, true); -- create a login user for the base company; replace 'email' and 'password' with real values
# \q
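
Because the password is stored with pgcrypto's crypt(), a login check can be done entirely in SQL by re-hashing the supplied password with the stored hash as the salt. The sketch below shows that idea from Node using the pg client; it is illustrative only, and the application's actual login code may differ:

// verify-login.js — pgcrypto password check (sketch, not the app's actual code)
const { Client } = require('pg');

async function verifyLogin(email, password) {
  const client = new Client({ user: 'username', password: 'dbpass', database: 'dbname', host: '127.0.0.1' });
  await client.connect();
  // crypt(supplied, stored_hash) reproduces the stored hash only when the password matches
  const res = await client.query(
    'SELECT id FROM login_user WHERE email = $1 AND password = crypt($2, password)',
    [email, password]
  );
  await client.end();
  return res.rowCount === 1;
}

verifyLogin('email', 'password').then((ok) => console.log('valid credentials:', ok));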

If the Sequelize DB migration doesn't work, load the schema from schema.sql instead; it also creates the base company and an initial login user

spark-property-manager$ psql -U username -d dbname -a -f schema.sql

Generate a self-signed cert and key to run over HTTPS

$ sudo openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout apache-selfsigned.key -out apache-selfsigned.crt
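
The generated key and cert are what the serverkey and servercert properties in app.properties point to when https is enabled. As a rough sketch of how Node serves TLS with such files (the handler and port here are placeholders, not the app's actual server code):

// https-sketch.js — serving over TLS with the self-signed pair (illustrative)
const https = require('https');
const fs = require('fs');

const options = {
  key: fs.readFileSync('/path/to/apache-selfsigned.key'),
  cert: fs.readFileSync('/path/to/apache-selfsigned.crt'),
};

https.createServer(options, (req, res) => {
  res.writeHead(200);
  res.end('TLS is working\n');
}).listen(8080, () => console.log('HTTPS listening on 8080'));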

Copy app.properties.TEMPLATE to app.properties and update it according to your environment

$ cp app.properties.TEMPLATE app.properties
$ vi app.properties
  • app.properties example
# contents of properties file
[db]
hostname = postgresql.host.com
port = 5432
name = dbname
dialect = postgres
username = dbuser
password = dbpass

[app]
hostname = localhost
port = 8080
sessionSecret = secret
memcachedHost = 127.0.0.1:11211
memcachedSecret = secret
https = false
serverkey = /path/to/server.key
servercert = /path/to/server.crt
url = http://localhost:8080

[log]
file = app.log
level = error

[smtp]
username = smtpUsername
password = smtpPassword
hostname = smtpHostname
port = 465
ssl = true
tls = false

If you set this up in a cloud environment with a domain name, update the url property accordingly so that the account-creation notification email includes the correct application URL, e.g. url = http://your.domain.com:8080
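
app.properties follows the common INI-style [section] / key = value layout, so it can be read with any INI parser. A quick way to inspect the resolved values from a script, assuming the ini npm package (the project may use a different parser internally):

// read-properties.js — inspect app.properties values (illustrative)
const fs = require('fs');
const ini = require('ini');                        // npm install ini

const props = ini.parse(fs.readFileSync('./app.properties', 'utf-8'));
console.log(props.db.hostname, props.db.port);     // e.g. postgresql.host.com 5432
console.log(props.app.memcachedHost);              // e.g. 127.0.0.1:11211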

Encrypt database password

  • encrypt the db password from the command line
spark-property-manager$ node
> var crypto = require('./util/crypto');
> crypto.encrypt('mypass');
'a199/unJEhzdS5lfoF3sQe1haMc5kg=='
  • put the encrypted password, with the '[encrypt]' prefix, into the db password field in app.properties
# contents of properties file
[db]
hostname = postgresql.host.com
port = 5432
name = dbname
dialect = postgres
username = dbuser
password = [encrypt]a199/unJEhzdS5lfoF3sQe1haMc5kg==
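
The project ships its own util/crypto module for this; its exact algorithm and key handling are internal to the app. Purely as an illustration of the pattern (prefix detection plus symmetric encryption via Node's crypto module, with a made-up key source), a comparable utility could look like this:

// crypto-sketch.js — illustrative only; the real util/crypto implementation may differ
const crypto = require('crypto');

const KEY = crypto.scryptSync('example-secret', 'example-salt', 32); // placeholder key derivation

function encrypt(plain) {
  const iv = crypto.randomBytes(16);
  const cipher = crypto.createCipheriv('aes-256-cbc', KEY, iv);
  const enc = Buffer.concat([cipher.update(plain, 'utf8'), cipher.final()]);
  return Buffer.concat([iv, enc]).toString('base64');
}

function decrypt(encoded) {
  const buf = Buffer.from(encoded, 'base64');
  const decipher = crypto.createDecipheriv('aes-256-cbc', KEY, buf.subarray(0, 16));
  return Buffer.concat([decipher.update(buf.subarray(16)), decipher.final()]).toString('utf8');
}

// Strip the '[encrypt]' prefix before decrypting, per the config convention above
function resolvePassword(value) {
  return value.startsWith('[encrypt]') ? decrypt(value.slice('[encrypt]'.length)) : value;
}

console.log(resolvePassword('[encrypt]' + encrypt('mypass'))); // mypass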

Static code analysis with JSHint and Grunt

$ npm i -g grunt-cli
$ grunt
Running "jshint:files" (jshint) task
>> 46 files lint free.

Done.

Static code analysis with ESLint

The .eslintrc.js file contains the ESLint configuration and rules

$ npm run lint

> [email protected] lint
> eslint --ext .js app.js bin config email log migrations models routes util seeders

To recreate the configuration instead of editing the existing one

$ npm init @eslint/config
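
For orientation, an .eslintrc.js for a Node/Mocha project of this shape typically looks something like the following; this is a generic example, not the project's actual configuration:

// .eslintrc.js — generic example, not the project's real config
module.exports = {
  env: {
    node: true,
    es2021: true,
    mocha: true,
  },
  extends: 'eslint:recommended',
  parserOptions: {
    ecmaVersion: 12,
  },
  rules: {
    'no-unused-vars': 'warn',
  },
};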

Run unit tests with Mocha

  • install mocha globally
$ npm i -g mocha
$ mocha
  • install mocha locally
$ npm i mocha
$ node_modules/.bin/mocha

util
  getImportAmount()
    ✓ get negative amount for positive return amount
    ✓ get negative amount for negative return amount
    ✓ get postive amount for positive sale amount
    ✓ get positive amount for negative sale amount
  getImportDescription()
    ✓ get description with return mark
    ✓ get description without return mark
  getRandomRGB()
    ✓ get RGB number list

crypto
  encrypt()
    ✓ test encrypt
  decrypt()
    ✓ test decrypt


9 passing (23ms)
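
To give a sense of what these specs look like, a round-trip test for the crypto util (whose encrypt/decrypt functions were used from the REPL above) could be written like this; it is a sketch in the same spirit as the existing suite, not a copy of it:

// test/crypto-roundtrip.test.js — illustrative spec in the style of the existing suite
const assert = require('assert');
const crypto = require('../util/crypto');

describe('crypto', function () {
  it('decrypts what it encrypts', function () {
    const encrypted = crypto.encrypt('mypass');
    assert.notStrictEqual(encrypted, 'mypass');               // ciphertext differs from plaintext
    assert.strictEqual(crypto.decrypt(encrypted), 'mypass');  // round trip restores the original
  });
});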

Run Application

$ npm start

Run Application Using the PM2 Process Manager

PM2 provides production-level process management; see the PM2 install guide.

  • install pm2
$ npm install pm2 -g
  • run the application with pm2
$ pm2 start ./bin/server.js --name "spark-property-manager" -i 8 -l pm2.log
[PM2] Starting /Users/spark/workspace3/spark-property-manager/bin/server.js in cluster_mode (8 instances)
[PM2] Done.
┌────────────────────────┬────┬─────────┬───────┬────────┬─────────┬────────┬──────┬───────────┬───────┬──────────┐
│ App name               │ id │ mode    │ pid   │ status │ restart │ uptime │ cpu  │ mem       │ user  │ watching │
├────────────────────────┼────┼─────────┼───────┼────────┼─────────┼────────┼──────┼───────────┼───────┼──────────┤
│ spark-property-manager │ 0  │ cluster │ 35491 │ online │ 0       │ 2s     │ 0%   │ 83.1 MB   │ spark │ disabled │
│ spark-property-manager │ 1  │ cluster │ 35494 │ online │ 0       │ 2s     │ 1%   │ 83.5 MB   │ spark │ disabled │
│ spark-property-manager │ 2  │ cluster │ 35511 │ online │ 0       │ 2s     │ 3%   │ 83.6 MB   │ spark │ disabled │
│ spark-property-manager │ 3  │ cluster │ 35528 │ online │ 0       │ 1s     │ 13%  │ 83.5 MB   │ spark │ disabled │
│ spark-property-manager │ 4  │ cluster │ 35547 │ online │ 0       │ 1s     │ 55%  │ 82.1 MB   │ spark │ disabled │
│ spark-property-manager │ 5  │ cluster │ 35564 │ online │ 0       │ 1s     │ 104% │ 75.2 MB   │ spark │ disabled │
│ spark-property-manager │ 6  │ cluster │ 35581 │ online │ 0       │ 0s     │ 95%  │ 54.8 MB   │ spark │ disabled │
│ spark-property-manager │ 7  │ cluster │ 35602 │ online │ 0       │ 0s     │ 77%  │ 35.8 MB   │ spark │ disabled │
└────────────────────────┴────┴─────────┴───────┴────────┴─────────┴────────┴──────┴───────────┴───────┴──────────┘
 Use `pm2 show <id|name>` to get more details about an app
  • stop the application with pm2
$ pm2 stop spark-property-manager
[PM2] Applying action stopProcessId on app [spark-property-manager](ids: 0,1,2,3,4,5,6,7)
[PM2] [spark-property-manager](0) ✓
[PM2] [spark-property-manager](1) ✓
[PM2] [spark-property-manager](2) ✓
[PM2] [spark-property-manager](3) ✓
[PM2] [spark-property-manager](4) ✓
[PM2] [spark-property-manager](5) ✓
[PM2] [spark-property-manager](6) ✓
[PM2] [spark-property-manager](7) ✓
┌────────────────────────┬────┬─────────┬─────┬─────────┬─────────┬────────┬─────┬────────┬───────┬──────────┐
│ App name               │ id │ mode    │ pid │ status  │ restart │ uptime │ cpu │ mem    │ user  │ watching │
├────────────────────────┼────┼─────────┼─────┼─────────┼─────────┼────────┼─────┼────────┼───────┼──────────┤
│ spark-property-manager │ 0  │ cluster │ 0   │ stopped │ 0       │ 0      │ 0%  │ 0 B    │ spark │ disabled │
│ spark-property-manager │ 1  │ cluster │ 0   │ stopped │ 0       │ 0      │ 0%  │ 0 B    │ spark │ disabled │
│ spark-property-manager │ 2  │ cluster │ 0   │ stopped │ 0       │ 0      │ 0%  │ 0 B    │ spark │ disabled │
│ spark-property-manager │ 3  │ cluster │ 0   │ stopped │ 0       │ 0      │ 0%  │ 0 B    │ spark │ disabled │
│ spark-property-manager │ 4  │ cluster │ 0   │ stopped │ 0       │ 0      │ 0%  │ 0 B    │ spark │ disabled │
│ spark-property-manager │ 5  │ cluster │ 0   │ stopped │ 0       │ 0      │ 0%  │ 0 B    │ spark │ disabled │
│ spark-property-manager │ 6  │ cluster │ 0   │ stopped │ 0       │ 0      │ 0%  │ 0 B    │ spark │ disabled │
│ spark-property-manager │ 7  │ cluster │ 0   │ stopped │ 0       │ 0      │ 0%  │ 0 B    │ spark │ disabled │
└────────────────────────┴────┴─────────┴─────┴─────────┴─────────┴────────┴─────┴────────┴───────┴──────────┘
 Use `pm2 show <id|name>` to get more details about an app
  • remove the application from pm2
$ pm2 delete spark-property-manager
[PM2] Applying action deleteProcessId on app [spark-property-manager](ids: 0,1,2,3,4,5,6,7)
[PM2] [spark-property-manager](0) ✓
[PM2] [spark-property-manager](1) ✓
[PM2] [spark-property-manager](2) ✓
[PM2] [spark-property-manager](3) ✓
[PM2] [spark-property-manager](4) ✓
[PM2] [spark-property-manager](5) ✓
[PM2] [spark-property-manager](6) ✓
[PM2] [spark-property-manager](7) ✓
┌──────────┬────┬──────┬─────┬────────┬─────────┬────────┬─────┬─────┬──────┬──────────┐
│ App name │ id │ mode │ pid │ status │ restart │ uptime │ cpu │ mem │ user │ watching │
└──────────┴────┴──────┴─────┴────────┴─────────┴────────┴─────┴─────┴──────┴──────────┘
Use `pm2 show <id|name>` to get more details about an app
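
The same start options can also be kept in a PM2 ecosystem file instead of passing flags each time. A minimal ecosystem.config.js mirroring the command above might look like this (the file is not part of the project; it simply restates the CLI flags):

// ecosystem.config.js — optional alternative to the CLI flags above (illustrative)
module.exports = {
  apps: [
    {
      name: 'spark-property-manager',
      script: './bin/server.js',
      instances: 8,
      exec_mode: 'cluster',
      out_file: 'pm2.log',
      error_file: 'pm2.log',
    },
  ],
};

Start it with pm2 start ecosystem.config.js; stopping and deleting by name work exactly as shown above.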

Open in a Browser

http://localhost:8080

Create an account with a valid email address; a temporary password will be sent to that address.


Log in with the temporary password sent to your email.


Change password


How to Record Expenses

  • Add a property from the Property Manager view. A building unit will be added automatically as the default unit.

  • Add or modify units for the added property from the Unit Manager view.

  • Add an expense by selecting the unit/property and expense type. You can also upload a photocopy of the receipt; on mobile, the user is prompted to take a picture or choose a photo from the device.

  • To import a bank or credit card statement, it needs to be in flat-file (.csv) format. Each bank and credit card company formats its statements differently, so first define which column number holds each data type. Once the import column configuration is set up on the Import Manager view, load the .csv file to populate expenses (see the sketch below for how such a column mapping applies to each row).
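
To make the column-configuration idea concrete, here is a small standalone sketch of mapping configured column numbers onto expense fields for each statement row. The field names and mapping object are purely illustrative; the app's Import Manager defines its own configuration:

// import-sketch.js — illustrative column mapping for a flat-file statement
// Assumed mapping: column 0 = date, column 2 = description, column 4 = amount
const fs = require('fs');

const columnConfig = { date: 0, description: 2, amount: 4 };

function rowToExpense(line, config) {
  const cols = line.split(',');                 // a real import would use a proper CSV parser
  return {
    date: cols[config.date],
    description: cols[config.description],
    amount: parseFloat(cols[config.amount]),
  };
}

const lines = fs.readFileSync('statement.csv', 'utf-8').trim().split('\n');
const expenses = lines.map((line) => rowToExpense(line, columnConfig));
console.log(expenses);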