I've read a history of Japan and cannot think of a single instance where either group would have had influence.
Most of the changes came when Commodore Perry forced the harbours open and Japan began adopting Western influences as the latest trend. From then until the end of WWII, Japan was controlled almost exclusively by its military. After WWII, the country was busy trying to modernize and catch up with the rest of the world under the watchful eye of the American occupation forces.
Various groups held influence inside Japan throughout its history, and many of them set their sights on expanding the empire outwards, but for a large portion of that history Japan was a closed country, so it is highly unlikely that either of the European-based groups had any effect on Japanese history.