The objective of this work was to develop a system that monitors the materials of a production line using IoT technology; currently, the process of monitoring and replacing parts depends on manual work. Load cells, a microcontroller, an MQTT broker, Telegraf, InfluxDB, and Grafana were used to implement a workflow that collects sensor data, stores it in a database, and visualizes it as weight and quantity. With these solutions, this work hopes to contribute to the logistics area, in the replacement and control of materials.
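As a minimal sketch of the "weight and quantity" step described above, the snippet below converts a raw load-cell reading into a weight, estimates a part count from it, and formats the data point in InfluxDB line protocol. The function names, calibration values, and measurement name are illustrative assumptions, not taken from the actual project.

```python
def reading_to_weight(raw: int, scale: float = 0.001, offset: int = 0) -> float:
    """Convert a raw load-cell (ADC) reading into grams, given a
    hypothetical calibration scale factor and zero offset."""
    return (raw - offset) * scale

def estimate_quantity(total_weight: float, unit_weight: float) -> int:
    """Estimate how many parts are on the scale from the weight of one part."""
    return round(total_weight / unit_weight)

def to_line_protocol(measurement: str, weight: float, quantity: int) -> str:
    """Format one data point in InfluxDB line protocol, the format in which
    Telegraf would forward it to the database."""
    return f"{measurement} weight={weight:.2f},quantity={quantity}i"

weight = reading_to_weight(125_000, scale=0.001)   # 125.0 g
qty = estimate_quantity(weight, unit_weight=25.0)  # about 5 parts
print(to_line_protocol("production_line", weight, qty))
```

The microcontroller would publish readings like this to the MQTT broker, and Telegraf would relay them into InfluxDB for Grafana to display.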
Leonardo Henrique da Paixão
A simple customer registration CRUD, with functions to register, delete, edit, and list new and existing customers in the database. PostgreSQL was used as the DBMS because of its ease of use and practicality.
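A minimal sketch of the four CRUD operations, assuming a hypothetical "customers" table with id, name, and email columns (the table and column names are illustrative, not from the project). The queries use the %s placeholder style that psycopg2 expects for PostgreSQL.

```python
# Each function returns a (sql, params) pair, ready to pass to
# cursor.execute(sql, params) on a psycopg2 connection.

def register(name: str, email: str):
    """Create: insert a new customer."""
    return ("INSERT INTO customers (name, email) VALUES (%s, %s)", (name, email))

def edit(customer_id: int, name: str, email: str):
    """Update: change an existing customer's data."""
    return ("UPDATE customers SET name = %s, email = %s WHERE id = %s",
            (name, email, customer_id))

def delete(customer_id: int):
    """Delete: remove a customer by id."""
    return ("DELETE FROM customers WHERE id = %s", (customer_id,))

def list_all():
    """Read: list all registered customers."""
    return ("SELECT id, name, email FROM customers ORDER BY id", ())

sql, params = register("Ana", "ana@example.com")
print(sql, params)
```

Keeping the SQL separate from the connection logic makes each operation easy to test and reuse.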
The NEAT (NeuroEvolution of Augmenting Topologies) neural network and Python's Pygame library are used to develop this project. The neural network, implemented in Python, lets the user define the inputs and outputs: that is, the input information (settings) and, for each input, a specific result. The job of the neural networks is to play the game with different inputs until they learn the "perfect" output.
I used only one layer because it was a simple neural network: it receives the inputs, and from them it may or may not create intermediate (hidden) layers; then, based on that decision, it generates the outputs.
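The single-layer idea above can be sketched as a direct input-to-output mapping: a weighted sum of the inputs followed by an activation. The weights, bias, and game inputs below are arbitrary illustrations, not the evolved NEAT genome from the project.

```python
import math

def sigmoid(x: float) -> float:
    """Standard sigmoid activation, squashing any value into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, weights, bias):
    """Single-layer forward pass: inputs map straight to the output,
    with no hidden layer in between."""
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return sigmoid(total)

# e.g. two hypothetical game inputs (distances to obstacles) producing one
# output; the agent could act (say, jump) when the output exceeds 0.5.
output = forward([0.8, 0.2], weights=[1.5, -2.0], bias=0.1)
print(output > 0.5)
```

In NEAT, evolution then decides whether adding hidden nodes between the inputs and outputs improves the score, which is the "create intermediate layers or not" decision described above.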
I learned a lot from Grafana, especially about data monitoring: since it is easy to use, I learned how to create quick and simple dashboards. With InfluxDB, I learned how a time-series DBMS works; until then I only knew relational and non-relational DBMSs, and the main difference between them was scalability. Finally, there was Telegraf, which is made by the same company as InfluxDB. Since I used the Windows operating system, Telegraf was the leading tool for the job; in addition, it has complete documentation, which made it easy to use, and I learned a lot about connections without having to write scripts to collect the data.
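The "connections without scripts" point can be illustrated with a minimal Telegraf configuration that reads sensor messages from the MQTT broker and writes them to InfluxDB. The broker address, topic layout, and database name below are illustrative assumptions, not the project's actual settings.

```toml
# Illustrative Telegraf configuration: MQTT in, InfluxDB out.
[[inputs.mqtt_consumer]]
  servers = ["tcp://localhost:1883"]       # hypothetical broker address
  topics = ["production_line/+/weight"]    # hypothetical topic layout
  data_format = "json"

[[outputs.influxdb]]
  urls = ["http://localhost:8086"]
  database = "production_line"             # hypothetical database name
```

With a configuration like this, Telegraf handles the collection pipeline declaratively, which is why no custom data-collection scripts were needed.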