Uncalibrated visual servoing (VS) can improve robot performance without requiring camera and robot parameters. Multiple cameras improve uncalibrated VS precision, but no prior work uses more than two cameras simultaneously. This work presents the first results for uncalibrated VS using more than two cameras at once. VS performance is also compared for two different camera models, a high-cost camera and a low-cost camera, which differ in image noise magnitude and focal length. A Kalman-filter-based control law for uncalibrated VS is introduced and shown to be stable under the assumptions that the robot's joint-level servo control can reach commanded joint offsets and that the servoing path passes through at least one full-column-rank robot configuration. Adaptive filtering by a covariance matching technique provides automatic camera weighting, prioritizing the best available data. A decentralized sensor fusion architecture ensures continuous servoing under camera occlusion. The decentralized adaptive Kalman filter (DAKF) control law is compared to a classical method, Gauss-Newton, in simulation and experiments. Numerical results show that DAKF improves average tracking error for moving targets and convergence time to static targets. DAKF also reduces system sensitivity to noise and poor camera placement, yielding smaller outliers than Gauss-Newton. The DAKF system improves visual servoing performance, simplicity, and reliability.
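The DAKF control law itself is developed in the body of the paper; as a rough illustration of the ideas named above, the following Python sketch combines per-camera Kalman estimation of the image Jacobian, innovation-based covariance matching, and inverse-covariance camera weighting in a decentralized fusion step. All class and function names, dimensions, and the damped-least-squares joint update are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

class LocalJacobianKF:
    """Per-camera Kalman filter over the vectorized image Jacobian (uncalibrated VS sketch)."""
    def __init__(self, n_feat, n_joint, q_var=1e-4, r_var=1.0, window=10):
        self.m, self.n = n_feat, n_joint
        self.x = np.zeros(n_feat * n_joint)        # vec(J) estimate (row-major)
        self.P = np.eye(n_feat * n_joint)          # state covariance
        self.Q = q_var * np.eye(n_feat * n_joint)  # process noise
        self.R = r_var * np.eye(n_feat)            # measurement noise (adapted online)
        self.innovations = []                      # recent innovations for covariance matching
        self.window = window

    def step(self, dq, ds):
        """Update the Jacobian estimate from a joint offset dq and observed feature change ds."""
        H = np.kron(np.eye(self.m), dq)            # ds = J dq  =>  ds = H vec(J)
        self.P = self.P + self.Q                   # predict (slowly varying Jacobian model)
        nu = ds - H @ self.x                       # innovation
        self.innovations = (self.innovations + [nu])[-self.window:]
        # Covariance matching: estimate R from the sample innovation covariance.
        C = np.mean([np.outer(v, v) for v in self.innovations], axis=0)
        self.R = np.diag(np.maximum(np.diag(C - H @ self.P @ H.T), 1e-6))
        S = H @ self.P @ H.T + self.R              # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)        # Kalman gain
        self.x = self.x + K @ nu
        self.P = (np.eye(len(self.x)) - K @ H) @ self.P
        return self.x.reshape(self.m, self.n), S

def fused_joint_command(filters, dqs, dss, errors, gain=0.1, damping=1e-3):
    """Decentralized fusion: weight each camera's Jacobian and feature error by the inverse
    of its innovation covariance, then take a damped least-squares step toward the goal."""
    rows_J, rows_e = [], []
    for f, dq, ds, e in zip(filters, dqs, dss, errors):
        if ds is None:                             # occluded camera: drop it this cycle
            continue
        J, S = f.step(dq, ds)
        w = 1.0 / np.trace(S)                      # noisier camera -> smaller weight
        rows_J.append(w * J)
        rows_e.append(w * e)
    if not rows_J:                                 # all cameras occluded: hold position
        return np.zeros(filters[0].n)
    J_all, e_all = np.vstack(rows_J), np.concatenate(rows_e)
    n = J_all.shape[1]
    # Damped least squares: dq = -gain * (J^T J + damping I)^-1 J^T e
    return -gain * np.linalg.solve(J_all.T @ J_all + damping * np.eye(n), J_all.T @ e_all)
```

In this sketch the covariance matching step is what yields the automatic camera weighting described above: a camera whose innovations are large (noisy images, poor placement) ends up with a larger innovation covariance and therefore contributes less to the fused joint command, while an occluded camera is simply skipped so servoing continues on the remaining views.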