The main advantages of Inductive Logic Programming (ILP) are its ability to employ background knowledge and to induce human-readable representations in the form of a set of first-order rules. Nevertheless, ILP systems struggle to classify imperfect data, such as noisy unseen examples that may not be covered by any learned rule. This thesis proposes a novel, flexible learning method called First-Order Logical Neural Network (FOLNN) to alleviate this limitation of rule-based systems. FOLNN is based on a feedforward neural network that integrates inductive learning from examples with background knowledge. In addition, the proposed method enables neural networks to process first-order logic programs directly. In the experiments, FOLNN is evaluated on two first-order learning problems and compared with PROGOL, a state-of-the-art ILP system. The experimental results show that the proposed method is more accurate than PROGOL on both datasets. Furthermore, we also evaluate FOLNN on a noisy domain to assess how robust the learner is to noisy data. The results show that the accuracy of our method decreases much more slowly, and remains much higher, than that of PROGOL.