
tfbook's Introduction

Hi there, welcome to my Github Page 👋

  • 💬 Ask me about Artificial Intelligence or Google
  • 📫 How to reach me: [email protected]
  • 😄 Pronouns: he/him
  • ⚡ Fun fact: Father to Chris and Claudia Moroney

Learn more about what I do by visiting my website!

Laurence's GitHub stats

tfbook's People

Contributors

futtetennista, lmoroney


tfbook's Issues

Chapter 12: the end index of the range function in tflite-transferlearning.ipynb is wrong

In line 2 of the third-from-last paragraph of tflite-transferlearning.ipynb for chapter 12, the end index of the range function should be 100, not 99. If the end index is 99, the last index visited will be 98, not 99.
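A quick sketch of the off-by-one:

# range(0, 99) yields 0..98, so index 99 is never visited;
# range(0, 100) (or simply range(100)) covers indices 0..99.
print(list(range(0, 99))[-1])   # 98
print(list(range(0, 100))[-1])  # 99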

The same flaw appears in line 2 of the last paragraph.

By the way, the statement index=73 in the last paragraph seems to have no effect, since the loop "for index in range(0,99)" reassigns index anyway.

Sorry for my bad English; I hope I have made this bug clear.

5. Introduction To Natural Language Processing: Getting Text from JSON Files

Hi,

If anyone is downloading the sarcasm dataset directly from the Kaggle website, this alteration to the JSON-loading code will execute properly (that version of the file stores one JSON object per line):

import json

datastore = []
# Each line of the Kaggle file is a separate JSON object, so parse it line by line
data = open("tmp/Sarcasm_Headlines_Dataset.json", 'r').readlines()
for line in data:
    datastore.append(json.loads(line))

from bs4 import BeautifulSoup

# 'table' (a str.maketrans punctuation-stripping table) and 'stopwords' are
# assumed to be defined earlier in the notebook, as in the book's example.
sentences = []
labels = []
urls = []
for item in datastore:
    sentence = item['headline'].lower()
    sentence = sentence.replace(",", " , ")
    sentence = sentence.replace(".", " . ")
    sentence = sentence.replace("-", " - ")
    sentence = sentence.replace("/", " / ")
    soup = BeautifulSoup(sentence, "html.parser")
    sentence = soup.get_text()
    words = sentence.split()
    filtered_sentence = ""
    for word in words:
        word = word.translate(table)
        if word not in stopwords:
            filtered_sentence = filtered_sentence + word + " "
    sentences.append(filtered_sentence)
    labels.append(item['is_sarcastic'])
    urls.append(item['article_link'])

Error in Colab for chapter 17

Hi Mr Moroney,

I purchased your book.
When running the chapter 17 example Colab found here:
https://github.com/lmoroney/tfbook/blob/master/chapter17/convert_basic.ipynb
I get this error message when running the conversion:

2021-11-24 23:22:42.307496: W tensorflow/core/common_runtime/gpu/gpu_bfc_allocator.cc:39] Overriding allow_growth setting because the TF_FORCE_GPU_ALLOW_GROWTH environment variable is set. Original config value was 0.
WARNING:tensorflow:SavedModel saved prior to TF 2.5 detected when loading Keras model. Please ensure that you are saving the model with model.save() or tf.keras.models.save_model(), NOT tf.saved_model.save(). To confirm, there should be a file named "keras_metadata.pb" in the SavedModel directory.
Traceback (most recent call last):

I then used tf.keras.models.save_model() to save the model, and the conversion seems to work.
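For reference, here is a minimal sketch (not the book's exact notebook) of the y = 2x - 1 example, saved via the Keras API so that keras_metadata.pb is written, then converted with the tensorflowjs Python package; the directory names are just placeholders:

import numpy as np
import tensorflow as tf
import tensorflowjs as tfjs  # pip install tensorflowjs

xs = np.array([-1.0, 0.0, 1.0, 2.0, 3.0, 4.0], dtype=float)
ys = np.array([-3.0, -1.0, 1.0, 3.0, 5.0, 7.0], dtype=float)

model = tf.keras.Sequential([tf.keras.layers.Dense(units=1, input_shape=[1])])
model.compile(optimizer='sgd', loss='mean_squared_error')
model.fit(xs, ys, epochs=500, verbose=0)

# The learned kernel should be close to 2 and the bias close to -1
print(model.get_weights())

tf.keras.models.save_model(model, 'saved_model_dir')      # writes keras_metadata.pb
tfjs.converters.save_keras_model(model, 'web_model_dir')  # writes model.json + .bin shards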
However, when I download group1-shard1of1.bin and model.json and use them with this code:

var savepar;
var savepar2;
async function run() {
    const MODEL_URL = 'model.json';
    const model = await tf.loadLayersModel(MODEL_URL);
    console.log(model.summary());
    // Predict for x = 10.0; for y = 2x - 1 the result should be close to 19
    const input = tf.tensor2d([10.0], [1, 1]);
    const result = model.predict(input);
    alert(result);
    // getWeights()[0] is the kernel (expected ~2), getWeights()[1] is the bias (expected ~-1)
    model.getWeights()[0].data().then(PromiseResult => savepar = PromiseResult[0]);
    model.getWeights()[1].data().then(PromiseResult => savepar2 = PromiseResult[0]);
}

The savepar and savepar2 variables hold weights very different from the expected values of roughly -1 and 2.

Do you have any suggestions as to why the results are off?

Kind regards

So many open issues

It seems like the author is only interested in selling a book that comes with broken code, and is not willing to respond to readers about code issues.

Chapter 2: model fitting runs on only 1875 training records per epoch (not 60,000) with fashion_mnist from Keras

I recently read chapter 2 and tried the code provided. I am a bit confused about why the model is only fitting 1875 records per epoch, while in the book it is 60,000 per epoch.

I also measured the length of the fashion_mnist dataset from Keras: it is 60,000 for the training set and 10,000 for the test set.

I tried and tested the same code on Google Colab, Jupyter Notebook, and PyCharm, but the issue is still the same.
(I also reshaped the input for the test and training sets, and ran the code with the old TensorFlow version 2.3.0 referenced in the book; previously I was using the latest TensorFlow version 2.7.0.)

https://colab.research.google.com/drive/1hkcax1f-4_t5N7FI6sCTyD1zZaBfM92n?authuser=1#scrollTo=eIF4S7sdIP_K&line=2&uniqifier=1
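One likely explanation, worth checking against the progress-bar output since it is not confirmed in this issue: newer tf.keras progress bars count steps (batches) per epoch rather than individual samples, and with the default batch_size of 32 the arithmetic works out to exactly 1875 steps:

# 60,000 training samples divided into batches of 32 (the Keras default)
samples = 60000
batch_size = 32
print(samples // batch_size)  # 1875 steps per epoch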

Issue on chapter 10 optimizer

On chapter 10,
optimizer=SGD(lr=1e-6, momentum=0.9) breaks the model training, making it always predict "nan".
Leaving the optimizer out (so that Keras falls back to its default) fixes the issue:

model.compile(loss="mse")
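For reference, a minimal compile sketch using the newer learning_rate argument name; the model here is a hypothetical placeholder rather than the chapter's actual windowed time-series model, and whether SGD at 1e-6 avoids NaNs still depends on how the data is scaled:

import tensorflow as tf

# Placeholder model just to show the compile call
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=[1])])

# 'lr' is deprecated in newer tf.keras releases; 'learning_rate' is the current name
model.compile(loss="mse",
              optimizer=tf.keras.optimizers.SGD(learning_rate=1e-6, momentum=0.9))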

Horse or Human InceptionV3 miscategorizing

Hello

I was working on the horse-or-human transfer learning section, using the datasets for training and validation from the URLs you supply. No matter what horse image I try, the model classifies it as human. I figured my local setup might have issues, so I opened and ran the Colab file for transfer learning in the chapter 3 folder. I received the same incorrect results using a random horse picture from the web, as well as when I uploaded some of the validation horse images from the Colab project.

Can't start TensorFlow Serving from Docker

Hi!
I got stuck in Chapter 19, "Deployment with TensorFlow Serving", when starting TensorFlow Serving from the Docker image.

When I run
docker run -t --rm -p 8501:8501 -v "$TESTDATA/saved_model_half_plus_two_cpu:/models/half_plus_two" -e MODEL_NAME=half_plus_two tensorflow/serving &
I get this error:
$ /usr/bin/tf_serving_entrypoint.sh: line 3: 6 Illegal instruction (core dumped) tensorflow_model_server --port=8500 --rest_api_port=8501 --model_name=${MODEL_NAME} --model_base_path=${MODEL_BASE_PATH}/${MODEL_NAME} "$@"

The curl command doesn't work either:
curl -d '{"instances": [1.0, 2.0, 5.0]}' -X POST http://localhost:8501/v1/models/half_plus_two:predict

error:
D:\Users\al>curl -d '{"instances": [1.0, 2.0, 5.0]}' -X POST http://localhost:8501/v1/models/half_plus_two:predict
curl: (3) bad range in URL position 2: [1.0,
^

I'm on Windows 7.

Glove.Twitter.27B.25d 404

!wget --no-check-certificate \
    https://storage.googleapis.com/laurencemoroney-blog.appspot.com/glove.twitter.27B.25d.zip \
    -O /tmp/glove.zip

<Error>
  <Code>NoSuchKey</Code>
  <Message>The specified key does not exist.</Message>
  <Details>No such object: laurencemoroney-blog.appspot.com/glove.twitter.27B.25d.zip</Details>
</Error>

chapter3/Horse_or_Human_WithAugmentation.ipynb - same files used for train and validation

Hi,

Before I forget, thanks again for writing these tutorials. I'm finding them extremely helpful as I prepare for the TF Developer's Exam.

In chapter3, the URLs for the train and validation sets in Horse_or_Human_WithAugmentation.ipynb both point to the same zip file.

!wget --no-check-certificate \
    https://storage.googleapis.com/laurencemoroney-blog.appspot.com/horse-or-human.zip \
    -O /tmp/horse-or-human.zip

!wget --no-check-certificate \
    https://storage.googleapis.com/laurencemoroney-blog.appspot.com/horse-or-human.zip \
    -O /tmp/validation-horse-or-human.zip

The URL for the validation set is likely this one:

https://storage.googleapis.com/laurencemoroney-blog.appspot.com/validation-horse-or-human.zip 

There's also a typo a few cells below.

validation_generator = train_datagen.flow_from_directory(
        '/tmp/vallidation-horse-or-human',  # should be 'validation'
        target_size=(300, 300),
        class_mode='binary')

'/tmp/vallidation-horse-or-human' should be '/tmp/validation-horse-or-human'.

Cheers,

  • MC

Chapter 6/sarcasm_swivel.ipynb contains a cell from another NB

The eighth cell in sarcasm_swivel.ipynb appears to be from a different notebook, since it references two variables (training_padded and testing_padded) that have not yet been defined.

# Need this block to get it to work with TensorFlow 2.x
import numpy as np
training_sentences = np.array(training_padded)
training_labels = np.array(training_labels)
testing_padded = np.array(testing_padded)
testing_labels = np.array(testing_labels)

Fetch API cannot load iris csv. URL scheme "file" is not supported.

I was working through the Iris example in Chapter 15 and trying to replicate it on my computer. Trying to open the HTML file in my browser, I got an error about fetching a URL beginning with "file://". Digging around, I saw that tf.data.csv uses fetch, which in turn does not support fetching local files. I tried to download Brackets, as suggested in the book, but it appears to be reaching end of support and can no longer be downloaded. Since my usual editor is VSCode, I found that the Live Server extension can be used as an alternative to serve the files from localhost and fix the fetch issue. Hope this helps anyone who might face this issue in the future.
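Another lightweight option, if you don't use VSCode, is Python's built-in http.server module. A minimal sketch (the port number is arbitrary), run from the folder that contains the HTML file and iris.csv:

# Serve the current folder over http://localhost:8000 so fetch/tf.data.csv
# can load iris.csv instead of failing on a file:// URL.
import http.server
import socketserver

with socketserver.TCPServer(("", 8000), http.server.SimpleHTTPRequestHandler) as httpd:
    httpd.serve_forever()

Equivalently, running "python -m http.server 8000" from the command line in that folder does the same thing.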
