1 Introduction
The aim of this paper is to present the control and driving mechanism of a robotic arm designed to grasp objects of different sizes, with applications in industry and other working environments. The robotic arm must be highly functional, lightweight, and easy to attach and control. The interdisciplinarity that characterizes Mechatronics, combining notions of Mechanical Engineering, Electronic-Electrical Engineering, and Computer Science, was used for the selection of materials and devices to construct the arm. Furthermore, it helped us to address any kinetic, power, torque, compatibility, or other problems that arose during the completion of the project. The major problem such robotic arms face is their cost ([16], [5], [14]). The main factors behind their expense are the use of advanced actuators, overly complex design and manufacturing techniques, and finally specialized sensors for user input and control. To address this challenge, we adopted a design that uses infrared (IR) sensors to provide virtual vision together with low-cost, commercially available actuators.
2 Robot Arm and Infrared Sensor Description
The robotic arm is a modular arm consisting of five rotary joints plus the end effector, which is a gripper. The five rotating joints are: one joint for base rotation, one for shoulder rotation, one for elbow rotation, one for wrist rotation, and one for gripper rotation. The mechanical parts of the project were selected one by one from Lynxmotion to meet our needs and are of the AL5 type. The six servo motors by Hitec were chosen based on the torque required for proper operation. The infrared sensor is a distance sensor by Sharp. The Sharp 2Y0A21 F46 is composed of an integrated combination of a PSD (position sensitive detector), an IRED (infrared emitting diode), and a signal processing circuit. The device outputs a voltage corresponding to the detected distance.
Every rotational joint of the arm is controlled by a servo motor. These motors are connected to a microcontroller board (BotBoarduino), which is controlled by a computer. The sensor is placed at the front of the arm, above the gripper, so it can 'read' the distance between the gripper and the reflecting surface.
3 Mechanical Engineering Issues
We studied the torque of the chosen servo motors to avoid any kinetic problems. Furthermore, we analyzed the degrees of freedom and the workspace of the arm. To control the arm, the forward and inverse kinematics were also developed. After measuring all the materials of the arm, we produced the CAD model using the SolidWorks software.
3.1 Torque Calculation
Torque (τ) is defined ([17], [6]) as a turning or twisting force and is calculated using the relation:

τ = F · L

The force (F) acts at a length (L) from a pivot point. In a vertical plane, the force causing an object to fall is the acceleration due to gravity (g) multiplied by its mass (m):

F = m · g

The above relation gives the object's weight (W):

W = m · g
In Figure 1 we can see the lengths (L) of the links as well as the weights (W) of the links, under the assumption that the center of mass of each link is located at roughly the center of its length. The load shown in the figure is the object being held by the arm, and the remaining marked elements are the actuators (servos). To calculate the required torque (τ1) of the first motor (servo) we use the relation:
(1)  
To calculate the required torque () of the motor ( servo):
(2)  
The torque () of the ( servo) is calculated:
(3)  
In a same manner the torque () of the ( servo):
(4) 
and finally the torque () of the ( servo):
(5) 
where: , , , , .
The weights are (Grip), (Sensor Bracket), (Wrist Bracket), (AT04) +(ASB06)+ (HUB08) =, (ASB205)+ ( ASB203) = .
The lengths are: cm, cm, cm, cm, cm. Substituting the above into relations (1)-(5), with the weight of the load set to zero, the torques of the motors are:

τ1 = kg·cm
τ2 = kg·cm
τ3 = kg·cm
τ4 = kg·cm
τ5 = kg·cm
The nominal torques of the servo motors, as given by the manufacturer, are:

kg·cm
kg·cm
kg·cm
kg·cm
kg·cm
From the above, we can say that the arm is capable of lifting its own weight, because the nominal torques of the servos exceed the calculated torques at zero load. If the load is now set, the torques are:

τ1 = kg·cm
τ2 = kg·cm
τ3 = kg·cm
τ4 = kg·cm
τ5 = kg·cm
We notice that with this load the servos can cope. If we increase the weight further, then the calculated torques are:

τ1 = kg·cm
τ2 = kg·cm
τ3 = kg·cm
τ4 = kg·cm
τ5 = kg·cm
Hence the maximum weight the arm can lift is approximately 300 g, because at that point the calculated torques reach the nominal torques of the servos.
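The torque bookkeeping above can be sketched in code. The following is a minimal sketch, not the paper's actual program: each link's weight is assumed to act at the center of its length, as in the analysis, and the link lengths and weights used in any example are hypothetical placeholders.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Required static holding torque (kg*cm) at a joint: each distal link's
// weight acts at the center of its length, and the load acts at the tip.
// Lengths in cm and weights in kg, matching the kg*cm servo ratings.
double requiredTorque(const std::vector<double>& lengthsCm,
                      const std::vector<double>& weightsKg,
                      double loadKg) {
    double torque = 0.0;
    double reach = 0.0;  // distance from the joint to the current link's start
    for (std::size_t i = 0; i < lengthsCm.size(); ++i) {
        torque += weightsKg[i] * (reach + lengthsCm[i] / 2.0);  // weight at link center
        reach += lengthsCm[i];
    }
    torque += loadKg * reach;  // load held at the arm's tip
    return torque;
}
```

For example, with two hypothetical 10 cm links of 0.1 kg each and a 0.3 kg load, the joint must hold 0.1·5 + 0.1·15 + 0.3·20 = 8 kg·cm, which is then compared against the servo's nominal torque.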
3.2 DOF (Degrees Of Freedom)
The arm has six actuators, one of which opens and closes the gripper and is therefore not considered a degree of freedom. The five rotational actuators have one degree of freedom each, with the result that the whole system has a total of five degrees of freedom. From a mathematical point of view, the DOF of the arm is described by the Gruebler-Kutzbach equation, expressed for a spatial mechanism by the relation:

M = 6(n − 1) − 5·j1 − 4·j2

where M is the DOF of the system, n is the number of links including the base frame, j1 is the number of joints that have one DOF, and j2 is the number of joints that have more than one DOF.
In Figure 2 we can see the links (A, B, C, D, E and F) and the joints (1, 2, 3, 4 and 5) of the arm. The number j2 is zero because there are no joints of two DOF in the system.
Therefore:

(6) M = 6(6 − 1) − 5·5 − 4·0 = 5
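The Gruebler-Kutzbach count can be expressed as a one-line function; the values used below (n = 6 links, j1 = 5 one-DOF joints, j2 = 0) are those of the arm.

```cpp
#include <cassert>

// Spatial Gruebler-Kutzbach mobility: n links (including the base frame),
// j1 one-DOF joints, j2 two-DOF joints.
int kutzbachDOF(int n, int j1, int j2) {
    return 6 * (n - 1) - 5 * j1 - 4 * j2;
}
```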
3.3 Work Space
The workspace of the arm is the space where the end effector can act. We compiled code in MATLAB for a 3D representation of the arm's workspace. In Figure 3 we can see the 3D presentation of the workspace of the arm using the Robotics Toolbox (see [8], [7]). The top hemisphere (colored yellow) is the actual workspace of the arm and the bottom hemisphere (colored blue) is a possible workspace of the arm under certain circumstances. The diameter of the work area of the arm is approximately 40 cm.
3.4 Forward Kinematics
Forward kinematics ([20], [12]) refers to the use of the kinematics equations of a robot to calculate the position of the end-effector from specified values of the joint parameters. The Denavit-Hartenberg (DH) parameters are the most common method used for the forward kinematics analysis. Using this method we defined the coordinate frames of the arm (Fig. 4), depending on the joints of the mechanism, and then the DH parameter table (Table 1) was calculated.
The coordinate frames have been defined with respect to the DH methodology, where:
a_i is the length of the common perpendicular between axes z_{i−1} and z_i, α_i is the angle between axes z_{i−1} and z_i, d_i is the displacement distance along z_{i−1} between the common perpendiculars, and θ_i is the joint angle between axes x_{i−1} and x_i.
(7) A_i = [ cosθ_i, −sinθ_i·cosα_i, sinθ_i·sinα_i, a_i·cosθ_i ;
            sinθ_i, cosθ_i·cosα_i, −cosθ_i·sinα_i, a_i·sinθ_i ;
            0, sinα_i, cosα_i, d_i ;
            0, 0, 0, 1 ]
Using the DH parameter table, the homogeneous transformation matrices result:
(8)  
The multiplication of the matrices (7)-(8) gives the total homogeneous transformation, which is expressed as:
(9) 
where:

n is the vector representing the direction of the x axis of the end-effector frame in the base coordinate system,

o is the vector representing the direction of the y axis of the end-effector frame in the base coordinate system,

a is the vector representing the direction of the z axis of the end-effector frame in the base coordinate system, and p is the vector representing the end-effector's position.
where:
=
=
=
=
=
=
=
=
=
=
=
= [6]
and c_i = cos θ_i, s_i = sin θ_i.
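The general DH link transform of equation (7) can be sketched as a small function. This is a generic implementation of the standard convention, not the paper's code, and the parameter values in any usage example are placeholders.

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Mat4 = std::array<std::array<double, 4>, 4>;

// Standard Denavit-Hartenberg link transform A_i(theta, d, a, alpha):
// Rot_z(theta) * Trans_z(d) * Trans_x(a) * Rot_x(alpha).
Mat4 dhTransform(double theta, double d, double a, double alpha) {
    double ct = std::cos(theta), st = std::sin(theta);
    double ca = std::cos(alpha), sa = std::sin(alpha);
    Mat4 T{};
    T[0] = {ct, -st * ca,  st * sa, a * ct};
    T[1] = {st,  ct * ca, -ct * sa, a * st};
    T[2] = {0.0, sa, ca, d};
    T[3] = {0.0, 0.0, 0.0, 1.0};
    return T;
}
```

Multiplying the per-joint matrices A_1…A_5 in order yields the total transformation of equation (9).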
3.5 Inverse Kinematic Analysis
In the inverse kinematic analysis ([20], [12]) we use the kinematics equations to find the joint parameters for a desired position of the end-effector. In other words, forward kinematics uses the joint parameters to compute the configuration of the kinematic chain, while inverse kinematics reverses this calculation to determine the joint parameters that achieve a desired configuration. The inverse kinematics problem is much more complex than forward kinematics, since there is no unique solution. In this project a geometric approach has been used for solving the inverse kinematics problem. The complexity of the inverse kinematic problem increases with the number of nonzero link parameters, and the geometric approach used to solve it is the simplest and most natural. The general idea of the geometric approach is to project the manipulator onto the x-y plane (Figure 5) and solve a simple trigonometry problem to find the base angle.
From the projection we can see that:

(10) θ1 = atan2(y, x)
The distance from the base to the edge of the grip is , therefore
(11) 
where
(12) 
A second projection of the manipulator is shown in Figure 6. From this projection we can see that the is :
(13) 
The is :
(14) 
Therefore, the angle of the shoulder joint is :
(15) 
The second solution of the angle is written:
(16) 
The elbow joint corresponds to the angle which equals to :
(17) 
The relation between the angle (grip rotation) and the angles is written:
(18) 
From the geometric approach analyzed above, the user will have complete control of the manipulator, controlling all six arm servos. When the user changes the angles, the grip angle will not change, thus preserving the point of the grip and its orientation. This is a consequence of the geometric approach (Figure 6). In other words, the three thrust mechanisms (wrist rotation, grip rotation, and opening and closing of the grip) are not affected by the movement of the angular movement mechanisms (base rotation, shoulder rotation, elbow rotation), and vice versa [6].
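The geometric approach can be illustrated with a minimal sketch under a simplified model: one base-rotation joint followed by a two-link planar chain. The link lengths a2 and a3 and the target coordinates are hypothetical, and this is only the pattern of equations (10) onward, not the paper's full five-joint solution.

```cpp
#include <cassert>
#include <cmath>

struct Angles { double theta1, theta2, theta3; };

// Geometric inverse kinematics sketch for a base-rotation joint plus a
// two-link planar arm (elbow-down branch). z is measured from the shoulder.
Angles inverseKinematics(double x, double y, double z, double a2, double a3) {
    Angles q{};
    q.theta1 = std::atan2(y, x);                 // base rotation, as in eq. (10)
    double r = std::sqrt(x * x + y * y);         // radial distance in the x-y plane
    double c3 = (r * r + z * z - a2 * a2 - a3 * a3) / (2.0 * a2 * a3);
    q.theta3 = std::atan2(-std::sqrt(1.0 - c3 * c3), c3);           // elbow angle
    q.theta2 = std::atan2(z, r)
             - std::atan2(a3 * std::sin(q.theta3),
                          a2 + a3 * std::cos(q.theta3));            // shoulder angle
    return q;
}
```

For a fully stretched target (r = a2 + a3, z = 0) this returns all zero angles; the mirrored elbow-up branch is obtained by flipping the sign of the square root, which is the second solution mentioned in the text.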
3.6 SolidWorks CAD-CAE
All the parts comprising the project were measured and designed using the SolidWorks software, a solid-modeling computer-aided design (CAD) and computer-aided engineering (CAE) program. Users can see the full dimensionality (2D or 3D) of every part comprising the arm, as well as the material of every part. Additionally, SolidWorks provides users with countless capabilities such as measuring parts, mass properties, motion studies, collision checks, etc. In Figure 7 we can see the final 3D rendering of the arm.
4 Electrical-Electronic Wiring
Communication compatibility between the devices and their proper powering are important for the correct operation of the project [11]. Once the correct devices have been selected based on the mechanical analysis, their electrical behavior should be analyzed in order to avoid encountering any communication or powering problems. In Figure 8 we can see the electrical diagram of the project. A current source (socket) feeds the computer's power supply adapter, and another source feeds the BotBoarduino's power supply adapter. All servos are powered by the BotBoarduino's adapter; the IR sensor is powered by the computer's USB cable through the BotBoarduino's regulator. The red (+, positive) and black (−, negative) cables of the servos and the IR sensor power the devices, while the yellow cable is for communication with the ATMEGA328 microcontroller and the computer through the USB cable (see [1], [3], [4], [9], [2], [13]).
4.1 Microcontroller Description
There are a lot of microcontroller boards on the market today with different functions, depending on the needs. The microcontroller board selected for this project is the BotBoarduino, because it can supply the desired currents for the devices (servos and IR sensor) and can also split two sources of power for different powering needs. For example, in our project the servos are powered with 6 VDC by bridging the jumper to the VS input, and the IR sensor is powered with 5 VDC by bridging the jumper to the VL input. The BotBoarduino is based on the ATMEGA328 microcontroller and has a USB mini port that connects to a computer for programming. The LD29150DT50R regulator that the BotBoarduino has onboard can supply up to 1.5 A through the VL input. An external source of power (VS input) can also be connected to the board, like the adapter (6 V / 2.25 A) we use, for powering devices that need greater voltage and current than the 5 V / 1.5 A of the VL input (see [3], [15], [9]).
4.2 Servomotors and IR Sensor
The angle of each servo is controlled by the Pulse Width Modulation (PWM) value, which is defined during programming. Each servo has an IC inside that reads the PWM signal emitted by the BotBoarduino and drives the servo to the desired position. The servos are powered with 6 V DC by the adapter through the BotBoarduino. The distance is measured by the IR sensor's Position Sensitive Detector (PSD), and the sensor's IC then 'translates' the measured distance into an output voltage. The greater the output voltage, the shorter the distance (within the sensor's operating range). In Figure 9 we can see the curve of the measured distance in relation to the output voltage. We can observe that the region of best accuracy of the sensor is between 10 cm and 15 cm. Accordingly, the distance between the sensor and the sorted objects must lie between these values for best accuracy (see [18], [3], [4]).
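The voltage-to-distance conversion can be approximated in code. The power-law fit below is a commonly used approximation of the GP2Y0A21 datasheet curve, not a manufacturer formula; its coefficients are assumptions and should be re-calibrated against Figure 9 for a specific sensor.

```cpp
#include <cassert>
#include <cmath>

// Approximate distance (cm) from the GP2Y0A21 output voltage (V).
// The coefficients of this power-law fit are assumed, not from the
// datasheet; the fit is only meaningful in roughly the 10-80 cm range.
double irDistanceCm(double volts) {
    if (volts <= 0.0) return INFINITY;        // no valid reading
    return 27.86 * std::pow(volts, -1.15);    // distance falls as voltage rises
}
```

Note that the function is monotonically decreasing, reflecting the inverse voltage-distance relation of the sensor.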
4.3 Power Consumptions
The power consumption of the devices of the arm is one of the most important topics in this study. The currents consumed by the servos are listed below:

HS-485HB = 180 mA
HS-805BB = 830 mA
HS-755HB = 285 mA
HS-645MG = 450 mA
HS-225MG = 340 mA
HS-422 = 180 mA
The sum of the consumption currents of the servos is 2,265 mA, while the adapter we selected for powering the servos supplies 2,250 mA. The problem that occurs can be solved by moving one servo at a time, which we achieve in the programming. The communication current of each servo is shown below:

HS-485HB = 40 mA
HS-805BB = 40 mA
HS-755HB = 40 mA
HS-645MG = 40 mA
HS-225MG = 40 mA
HS-422 = 40 mA
The powering current of the IR sensor is 30 mA and its communication current is 40 mA. The sum of the communication currents of the servos plus the powering and communication currents of the sensor is 310 mA. These currents are supplied by the USB cable (5 V / 0.5 A) of the computer through the LD29150DT50R current regulator IC (5 V / 1.5 A) of the microcontroller. The regulator gives us a maximum of 1.5 A, which easily covers the required 310 mA (see [1], [13], [15], [9]).
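The budget reasoning above can be captured in a small helper; the current figures used in the examples are those listed in this section.

```cpp
#include <cassert>
#include <numeric>
#include <vector>

// Check whether a set of simultaneous current draws (mA) fits a supply budget.
bool withinBudget(const std::vector<int>& currentsMA, int budgetMA) {
    int total = std::accumulate(currentsMA.begin(), currentsMA.end(), 0);
    return total <= budgetMA;
}
```

All six servos moving at once (2,265 mA) exceed the 2,250 mA adapter, but any single servo fits, which is why the programs move one servo at a time; likewise, the 310 mA of signal and sensor currents fits comfortably within the regulator's 1,500 mA.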
5 Programming the Robotic Arm
In this section we present the programming of the robotic arm. The software used is the Arduino IDE (Integrated Development Environment). The programming languages the software supports are C and C++ (see [19], [10]). The Arduino IDE supplies programmers with software libraries that provide many input and output procedures. A library can be loaded into a program by writing, for example, #include <math.h> at the top of the sketch; this library provides the mathematical functions used during programming. After the analyses of the previous sections (Mechanical/Electrical-Electronic Engineering) we came up with four programs that demonstrate the cooperation of the arm and sensor and its further applications within contemporary industrial forms of production.
5.1 Control cases
5.1.1 Autonomous Operation No.1
During the first autonomous operation, the arm takes the initial start position of the program, shown in Figure 10; then the arm takes the measuring position, according to the study we performed for the IR sensor (Fig. 11). As the arm reaches the measuring position, the IR sensor starts to collect distance measurements between the 'eye' of the sensor and the sorting area, as shown by the red circle in Figure 11. The distance between the sensor and the empty sorting area is approximately 13.8 cm. If the measurement is lower than 13.8 cm, the sensor is set to recognize that an object has been placed in the sorting area, and the arm picks it up and puts it into the bucket.
The video for the first autonomous operation can be seen in this URL: www.youtube.com/watch?v=srE3x6y4jqU&feature=youtu.be
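The detection rule of this operation can be sketched as a pure function; the threshold is the one given in the text, and the Arduino-specific sensor-reading code is omitted.

```cpp
#include <cassert>

// Object-presence test for Autonomous Operation No.1: the empty sorting
// area reads about 13.8 cm, so any shorter reading means an object is there.
const double EMPTY_AREA_CM = 13.8;

bool objectPresent(double measuredCm) {
    return measuredCm < EMPTY_AREA_CM;
}
```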
5.1.2 Autonomous Operation No.2
The initial start position and the measuring position are the same as in Autonomous Operation No.1. If the sensor measures a distance between 13.8 cm and 10 cm, it recognizes that the object is 'short' (predefined measurements given). If the measurement is less than 10 cm, the sensor recognizes that the object is 'tall', since a taller object brings the reflecting surface closer to the sensor. The arm then picks the object and places it into a bucket at a predefined position (left for 'short' and right for 'tall').
The video for the second autonomous operation can be seen in this URL: www.youtube.com/watch?v=e8vaBb9g2A&feature=youtu.be
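The short/tall decision can likewise be sketched. The thresholds are those given in the text, and the function is an illustration of the classification logic rather than the paper's actual sketch code.

```cpp
#include <cassert>
#include <string>

// Height classification for Autonomous Operation No.2: a taller object
// brings the reflecting surface closer to the sensor, so a smaller
// distance reading means a taller object.
std::string classifyObject(double measuredCm) {
    if (measuredCm >= 13.8) return "none";   // empty sorting area
    if (measuredCm >= 10.0) return "short";  // between 10 cm and 13.8 cm
    return "tall";                           // closer than 10 cm
}
```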
5.1.3 Autonomous Operation No.3
The third autonomous operation is almost the same as Autonomous Operation No.2. It differs in the placement of the objects: a 'short' object is placed in the predefined area shown in Figure 12, and a 'tall' object is placed in the predefined area shown in Figure 13.
The video for the third autonomous operation can be seen in this URL: www.youtube.com/watch?v=rFqdlcLnQ08
5.1.4 Manual Operation
The fourth program is the manual operation of the arm, performed by a user with the keyboard of a computer loaded with said program. Users have full control of the arm, manipulating each servo independently. Additionally, users can see the measurements of the sensor in the display of the Arduino IDE.
The video for the manual operation can be seen in this URL:
www.youtube.com/watch?v=oFLjFvMqPjs&feature=youtu.be ([18],[19],[20])
6 Conclusions
A robot arm has been designed to cooperate with an infrared sensor to identify objects of different sizes and sort them to predefined positions. The manual operation of the arm through the computer's keyboard is also presented. The study is based on the three main axes of which Mechatronics consists:

Mechanical Engineering
Electrical-Electronic Engineering
Computer Science
In the case of mechanical engineering, we analyzed the torque of the servos, the degrees of freedom and workspace of the arm, the mathematical modeling of the forward and inverse kinematics, and the CAD of the arm. The analysis of electrical-electronic engineering was important for the required powering of the devices (BotBoarduino, servos, IR sensor), for the communication between devices, and for the maximum efficiency of the devices. Three promising experiments have been conducted concerning the autonomous operations and the manual operation of the arm, which can be applied in industry as well as in other working environments. The IR sensor can identify a variety of objects based on their height and validate position and orientation information of the grasped object. The designed robotic arm may be an educational one, but the procedure and methodology followed are similar to those for an industrial robotic arm. The next steps comprise a kinematic update to the arm, an object reorientation routine, and the dynamics and kinematics of the object to improve stable grasping.
References
 [1] ”Arduino,” 2005. [Online]. Available: www.arduino.cc. [Accessed 2016].
 [2] www.lynxmtion.com/images/html/build185.htm
 [3] ”Servo Motor Guide” http://www.anaheimautomation.com/manuals/forms/servomotorguide.phpsthash.kpgbxS4k.dpbs
 [4] BALDOR ELECTRIC COMPANY, SERVO CONTROL FACTS http://www.baldor.com.
 [5] Barakat, A. N., Gouda, K. A. and Bozed, K. A., Kinematics analysis and simulation of a robotic arm using MATLAB, 2016 4th International Conference on Control Engineering & Information Technology (CEIT), IEEE, 2016, pp. 1-5.
 [6] Benson, C. Robot Arm Torque Tutorial http://www.robotshop.com/blog/en/robotarmtorquetutorial 7152, 2016
 [7] Corke P., Robotics, vision and control. Fundamental algorithms in MATLAB. Berlin: Springer, 2011.
 [8] Corke, P., "A robotics toolbox for MATLAB", IEEE Robotics & Automation Magazine, Vol. 3(1), pp. 24-32, 1996.
 [9] F. Chips, FT232R Datasheet , Future Technology Devices International Ltd., 2015.
 [10] B. Eckel, Thinking in C++, Volume 1, 2nd Edition, Upper Saddle River, New Jersey 07458 : Prentice Hall , January 13, 2000
 [11] Behrouz A. Forouzan, Data Communications and Networking, Second Edition, Higher Education, 2000.
 [12] Craig, J. (2017), "Introduction to Robotics: Mechanics and Control (4th Edition)", Pearson.
 [13] D. Hart, Power Electronics, McGraw-Hill, New York, NY 10020, 2011.
 [14] Khanna, P., Singh, K., Bhurchandi, K. M., and Chiddarwar, S., Design analysis and development of low cost underactuated robotic hand. In 2016 IEEE International Conference on Robotics and Biomimetics (ROBIO) (Dec 2016), pp. 2002-2007.
 [15] LD29150 Datasheet, STMicroelectronics, 2013.
 [16] Lee, D. H., Park, J. H., Park, S. W., Baeg, M. H., and Bae, J. H., KITECH-Hand: A highly dexterous and modularized robotic hand. IEEE/ASME Transactions on Mechatronics 22, 2 (April 2017), 876-887.
 [17] R. A. Serway, Physics for Scientists and Engineers. 6th Ed., Brooks Cole, 2003.
 [18] Sharp Corporation. GP2Y0A21YK0F Datasheet, 2006
 [19] Soulié, J., C++ Language Tutorial, cplusplus.com, 2007.
 [20] Mark W. Spong, Robot Modeling and Control 1st Edition, John Wiley and Sons, 2005.