# <center><i class="fa fa-edit"></i> Basic TensorFlow </center>

###### tags: `Internship`

:::info
**Goal:**
- [x] Understand the basics behind TensorFlow
- [x] TensorFlow graphs

**Resources:**
[Towards Data Science Page](https://towardsdatascience.com/understanding-lstm-and-its-quick-implementation-in-keras-for-sentiment-analysis-af410fd85b47)
[Adventures in Machine Learning](https://adventuresinmachinelearning.com/neural-networks-tutorial/#first-attempt-feed-forward)
[Machine Learning](https://hackmd.io/@Derni/HJQkjlnIP)
:::

### TensorFlow Graphs

PURPOSE: Represent computations as a graph so that independent operations can run in parallel, increasing efficiency.

:::success
**Example**

$a = (b + c) * (c + 2)$ can also be expressed as:

![](https://i.imgur.com/5GweDc9.png)

As a graph:

![](https://i.imgur.com/4FfdXab.png)

Since $d = b + c$ and $e = c + 2$ do not depend on each other, they can be computed in parallel (see the code sketch at the end of this note).
:::

TensorFlow computational graph of a three-layer neural network:

![](https://i.imgur.com/uQzNYLb.png)

- Tensors: the multi-dimensional data arrays that flow along the graph's edges between operation nodes
    - Ex: the input tensor is 5000 x 64 x 1 for a 64-node input layer with 5000 training samples
- Rectified linear units: the activation function in the hidden layer after the input layer
- Logit layer: the final output layer
    - Uses cross entropy as the cost/loss function
- The relevant tensors flow into the "Gradients" block
    - They feed the Stochastic Gradient Descent optimizer
    - This performs backpropagation and gradient descent (see the sketch below)
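
The simple graph from the example box can be built directly in code. Below is a minimal sketch using the TensorFlow 1.x-style graph/session API (written with `tf.compat.v1` so it also runs under TensorFlow 2.x); the tensor names and the feed values `b = 1`, `c = 2` are illustrative choices, not part of the notes above.

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()  # build a static computational graph, TF 1.x style

# Inputs to the graph
b = tf.placeholder(tf.float32, name='b')
c = tf.placeholder(tf.float32, name='c')

# d and e do not depend on each other, so TensorFlow may evaluate them in parallel
d = tf.add(b, c, name='d')
e = tf.add(c, 2.0, name='e')
a = tf.multiply(d, e, name='a')

with tf.Session() as sess:
    print(sess.run(a, feed_dict={b: 1.0, c: 2.0}))  # (1 + 2) * (2 + 2) = 12.0
```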
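
A rough sketch of how the three-layer graph above (ReLU hidden layer, logit output layer, cross-entropy loss, SGD optimizer) could be assembled with the same low-level API. The 64-node input layer and 5000 training samples come from the notes; the hidden-layer width, number of classes, learning rate, and random dummy data are assumptions made purely for illustration.

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

n_samples, n_inputs = 5000, 64        # from the notes above
n_hidden, n_classes = 32, 10          # assumed values for this sketch

# Placeholders for the input tensor and integer class labels
x = tf.placeholder(tf.float32, [None, n_inputs], name='x')
y = tf.placeholder(tf.int64, [None], name='y')

# Hidden layer with rectified linear units
W1 = tf.Variable(tf.random.normal([n_inputs, n_hidden], stddev=0.1))
b1 = tf.Variable(tf.zeros([n_hidden]))
hidden = tf.nn.relu(tf.matmul(x, W1) + b1)

# Logit layer: raw output scores, no softmax applied yet
W2 = tf.Variable(tf.random.normal([n_hidden, n_classes], stddev=0.1))
b2 = tf.Variable(tf.zeros([n_classes]))
logits = tf.matmul(hidden, W2) + b2

# Cross-entropy loss over the logits
loss = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=logits))

# minimize() adds the gradient ops ("Gradients" block) and the SGD update,
# i.e. backpropagation followed by a gradient-descent step
train_op = tf.train.GradientDescentOptimizer(learning_rate=0.1).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Dummy data standing in for the 5000 training samples
    x_batch = np.random.rand(n_samples, n_inputs).astype(np.float32)
    y_batch = np.random.randint(0, n_classes, size=n_samples)
    for step in range(5):
        _, loss_val = sess.run([train_op, loss], feed_dict={x: x_batch, y: y_batch})
        print(step, loss_val)
```

Calling `minimize(loss)` is what makes the "Gradients" block appear in the TensorBoard graph: it traces the relevant tensors back through the network and wires their gradients into the SGD weight updates.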