IMail
- Install Node.js
- Install the required modules
npm install
- Run frontend on port 3000
npm run dev
Additional steps before running:
- Register as a Google developer, create an application, and enable Gmail API access for it (see the Gmail API Quickstart)
- Replace the Client ID and API key in gmail_controller.js under public/js/controllers
Note: All requests are sent to IRIS on port 52774; change this before you run the application.
You will need to edit \Frontend React JS\gmail\public\js\controllers\gmail_controller.js,
which references http://localhost:52774/api/email... — change the port and host there if necessary.
- Create a Namespace (IMAIL)
- Import EmailIntel.xml, which includes all classes and interoperability components.
- Create a Web application (/api/email/) in IRIS with Dispatch Class = "Email.RESTOperations" (RESTOperations.cls)
- Open \IMAIL\Email_Intelligence.ipynb in Google Colab (https://colab.research.google.com/) by importing (uploading) it
- The required training dataset (Enron and Apache) is available on Google Drive.
Upload the dataset to your own Google Drive and change the pd.read_csv() path to its location there (recommended)
- Select Runtime as Python 3 and GPU.
- Run all code cells in order (important) up until the Evaluation section (if you just want the model file, you do not need to evaluate the model)
Instructions/explanations for each cell are provided in the ipynb file.
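The dataset-loading step above can be sketched as follows. The Drive path and CSV filename are assumptions — substitute the location where you actually uploaded the dataset:

```python
import pandas as pd

# Hypothetical path: adjust to wherever you uploaded the Enron/Apache
# dataset in your own Google Drive (the filename is an assumption).
CSV_PATH = "/content/drive/MyDrive/IMail/enron_apache.csv"

def load_dataset(path=CSV_PATH):
    """Read the training CSV into a pandas DataFrame."""
    return pd.read_csv(path)

# In Colab, mount your Drive first so the path above resolves:
# from google.colab import drive
# drive.mount("/content/drive")
```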
Additional steps may be required if you want to use a Google Cloud bucket:
- Install Python 3.6 (it is very important that you install this exact version of Python)
- Install all required libraries:
pip install tensorflow
pip install tensorflow-hub
pip install flask
pip install bert-tensorflow
pip install pandas
pip install scikit-learn
- Place your trained model in the correct folder and change OUTPUT_DIR in serving.py to its location
It is recommended to download the BERT base model locally as well, so you don't have to re-download it every time the server runs.
Download every file from the Google Cloud bucket after the initial training, for example:
checkpoint
graph.pbtxt
model.ckpt-4503.data-00000-of-00001
model.ckpt-4503.index
model.ckpt-4503.meta
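A quick sanity check (not part of serving.py) that the downloaded checkpoint files actually landed in OUTPUT_DIR; the directory name and the step number 4503 follow the example listing above and will differ for your own training run:

```python
from pathlib import Path

# Assumed local model directory; point this at your own OUTPUT_DIR.
OUTPUT_DIR = Path("./model")

# File names taken from the example listing; your checkpoint step will differ.
EXPECTED = [
    "checkpoint",
    "graph.pbtxt",
    "model.ckpt-4503.data-00000-of-00001",
    "model.ckpt-4503.index",
    "model.ckpt-4503.meta",
]

def missing_files(output_dir=OUTPUT_DIR):
    """Return the expected checkpoint files that are absent from output_dir."""
    output_dir = Path(output_dir)
    return [name for name in EXPECTED if not (output_dir / name).exists()]
```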
- Execute serving.py; it will start listening on port 5000
- POST to http://localhost:5000/predict
Makes a prediction. The request body contains JSON in the following format:
{ "raw": "String that you want to classify" }
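A minimal client sketch for the /predict endpoint using only the Python standard library; the function names are illustrative, and the response shape is whatever serving.py returns:

```python
import json
import urllib.request

def build_predict_payload(text):
    """Request body expected by /predict."""
    return {"raw": text}

def predict(text, url="http://localhost:5000/predict"):
    """POST the text to serving.py and return the parsed JSON response."""
    data = json.dumps(build_predict_payload(text)).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```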
- POST to http://localhost:5000/train
Performs incremental training. The request body contains JSON in the following format:
{ "text":["The phone I was purchasing yesterday on the website got a great discount","Purchasing phones from our website now and you can get a great discount"], "spam":[0,1] }
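Similarly, a sketch for the /train endpoint; the parallel text/spam lists mirror the example body above, and the helper names are illustrative:

```python
import json
import urllib.request

def build_train_payload(texts, labels):
    """Request body expected by /train: parallel lists of texts and 0/1 spam labels."""
    if len(texts) != len(labels):
        raise ValueError("texts and labels must be the same length")
    return {"text": list(texts), "spam": list(labels)}

def train(texts, labels, url="http://localhost:5000/train"):
    """POST the training examples to serving.py and return the parsed response."""
    data = json.dumps(build_train_payload(texts, labels)).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```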