I have a dataset formatted like the Iris dataset, and I want to analyze it with a custom classifier.
Since I'm not very comfortable with Python, I've been adapting someone else's code (thanks! :) ), and so far it has worked well. However, I'd like to add an accuracy score, because my dataset has over 1 million data points and checking the predictions one by one isn't practical.
The code I'm using is:
# Dependencies
import tensorflow as tf
import pandas as pd
import numpy as np
print(tf.__version__)
# Make results reproducible
seed = 1234
np.random.seed(seed)
tf.set_random_seed(seed)
# Loading the dataset
dataset = pd.read_csv('Iris_Dataset.csv')
dataset = pd.get_dummies(dataset, columns=['Species']) # One Hot Encoding
values = list(dataset.columns.values)
#y is the CellType value x is the data
# Python is a 0 index language
y = dataset[values[-3:]]
y = np.array(y, dtype='float32')
X = dataset[values[1:-3]]
X = np.array(X, dtype='float32')
# Shuffle Data
indices = np.random.choice(len(X), len(X), replace=False)
X_values = X[indices]
y_values = y[indices]
# Creating a Train and a Test Dataset
test_size = 10
X_test = X_values[-test_size:]
X_train = X_values[:-test_size]
y_test = y_values[-test_size:]
y_train = y_values[:-test_size]
# Session
sess = tf.Session()
# Interval / Epochs
interval = 50
epoch = 500
# Initialize placeholders
X_data = tf.placeholder(shape=[None, 4], dtype=tf.float32)
y_target = tf.placeholder(shape=[None, 3], dtype=tf.float32)
# Input neurons : 4
# Hidden neurons : 9
# Output neurons : 3
hidden_layer_nodes = 9
# Create variables for Neural Network layers
w1 = tf.Variable(tf.random_normal(shape=[4,hidden_layer_nodes])) # Inputs -> Hidden Layer
b1 = tf.Variable(tf.random_normal(shape=[hidden_layer_nodes])) # First Bias
w2 = tf.Variable(tf.random_normal(shape=[hidden_layer_nodes,3])) # Hidden layer -> Outputs
b2 = tf.Variable(tf.random_normal(shape=[3])) # Second Bias
# Operations
hidden_output = tf.nn.relu(tf.add(tf.matmul(X_data, w1), b1))
final_output = tf.nn.softmax(tf.add(tf.matmul(hidden_output, w2), b2))
# Cost Function
loss = tf.reduce_mean(-tf.reduce_sum(y_target * tf.log(final_output), axis=0))
# Optimizer
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.001).minimize(loss)
# Initialize variables
init = tf.global_variables_initializer()
sess.run(init)
# Training
print('Training the model...')
for i in range(1, (epoch + 1)):
    sess.run(optimizer, feed_dict={X_data: X_train, y_target: y_train})
    if i % interval == 0:
        print('Epoch', i, '|', 'Loss:', sess.run(loss, feed_dict={X_data: X_train, y_target: y_train}))
# Prediction
print()
for i in range(len(X_test)):
    print('Actual:', y_test[i], 'Predicted:', np.rint(sess.run(final_output, feed_dict={X_data: [X_test[i]]})))
# Evaluate accuracy.
accuracy_score = y_target.evaluate(input_fn=y_target)["accuracy"]
print("\nTest Accuracy: {0:f}\n".format(accuracy_score))
The output I get is:
1.4.0
2018-04-05 09:06:35.688295: I tensorflow/core/platform/cpu_feature_guard.cc:137] Your CPU supports instructions that this TensorFlow binary was not compiled to use: SSE4.1 SSE4.2 AVX AVX2 FMA
Training the model...
Epoch 50 | Loss: 11.3519
Epoch 100 | Loss: 7.53955
Epoch 150 | Loss: 6.3168
Epoch 200 | Loss: 5.58197
Epoch 250 | Loss: 5.09148
Epoch 300 | Loss: 4.74129
Epoch 350 | Loss: 4.47681
Epoch 400 | Loss: 4.26831
Epoch 450 | Loss: 4.09931
Epoch 500 | Loss: 3.95926
Actual: [ 0. 0. 1.] Predicted: [[ 0. 0. 1.]]
Actual: [ 1. 0. 0.] Predicted: [[ 1. 0. 0.]]
Actual: [ 0. 0. 1.] Predicted: [[ 0. 0. 1.]]
Actual: [ 1. 0. 0.] Predicted: [[ 1. 0. 0.]]
Actual: [ 1. 0. 0.] Predicted: [[ 1. 0. 0.]]
Actual: [ 0. 0. 1.] Predicted: [[ 0. 0. 1.]]
Actual: [ 0. 0. 1.] Predicted: [[ 0. 0. 1.]]
Actual: [ 0. 1. 0.] Predicted: [[ 0. 1. 0.]]
Actual: [ 1. 0. 0.] Predicted: [[ 1. 0. 0.]]
Actual: [ 1. 0. 0.] Predicted: [[ 1. 0. 0.]]
Traceback (most recent call last):
File "/Users/XXXX/PycharmProjects/TensorFlow1/Iris/Iris_Network.py", line 86, in <module>
accuracy_score = y_target.evaluate(input_fn=y_target)["accuracy"]
AttributeError: 'Tensor' object has no attribute 'evaluate'
Process finished with exit code 1
As you can see, only the last two lines of code are wrong. What should they be?
Best answer
You can use the score from sklearn's precision_score function. That said, you don't need a for loop over your test set: you can get all the predictions in one call.
y_pred = np.rint(sess.run(final_output, feed_dict={X_data: X_test}))
As for the score:
score = sklearn.metrics.precision_score(y_test, y_pred, average='micro')  # an average= argument is needed for one-hot / multiclass targets
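Since what you actually asked for is accuracy, a minimal sketch (assuming y_pred is the batched prediction above and numpy is already imported as np) would convert the one-hot rows back to class indices and pass them to sklearn.metrics.accuracy_score:
from sklearn.metrics import accuracy_score
# Convert one-hot rows (e.g. [0. 0. 1.]) back to class indices (e.g. 2)
y_true_classes = np.argmax(y_test, axis=1)
y_pred_classes = np.argmax(y_pred, axis=1)
print('Test Accuracy:', accuracy_score(y_true_classes, y_pred_classes))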
Of course, you'll need to import sklearn.metrics (or just precision_score from it) for this to work.
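Alternatively, you could keep everything inside the TensorFlow graph and skip sklearn entirely. A sketch under that assumption (not part of the original answer; it reuses final_output, y_target, X_data and sess from the question's code):
# Compare the predicted class (argmax of the softmax output) with the true class
correct_prediction = tf.equal(tf.argmax(final_output, 1), tf.argmax(y_target, 1))
# Cast the booleans to floats and average them to get the accuracy
accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
print('Test Accuracy:', sess.run(accuracy, feed_dict={X_data: X_test, y_target: y_test}))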