Connected Toys Expose Smart Homes: Report
Security experts have warned of several flaws in connected toys which could allow hackers to talk to the children using them or even launch attacks against the smart home.
British consumer advice group Which? enlisted the help of pen testing firm NCC Group to run the rule over seven smart toys from major retailers Amazon, Smyths, Argos and John Lewis.
Several, including the Singing Machine SMK250PP and TENVA’s pink karaoke microphone, don’t require session-based authentication for their Bluetooth connection. This could allow hackers to anonymously pair with and stream audio into them — potentially offensive or even “manipulative" messages exhorting the child using the device to go outside, NCC Group claimed.
A similar issue existed in KidiGear walkie talkies from Vtech.
“A pair of walkie talkies investigated as part of this security assessment allowed for children to communicate with each other, within a range of up to 150 meters. There was no mutual authentication between the pairs of walkie talkie devices,” NCC Group continued.
“This means that if an attacker purchased the same set of toys and was in range of an unpaired, powered-on walkie talkie, they would be able to successfully pair with it and engage in a two-way conversation with the child user under certain conditions.”
However, the chances of this happening are pretty slim, according to Vtech.
“The pairing of KidiGear Walkie Talkies cannot be initiated by a single device. Both devices have to start pairing at the same time within a short 30 second window in order to connect,” it clarified. Once paired, a handset cannot be paired with a third device owned by a stranger.
NCC Group also uncovered potential problems with the karaoke toys, which it said could be used to launch “second-order IoT attacks.”
“With the two karaoke toys investigated and their unauthenticated Bluetooth implementations, it was possible to connect to them when in range and issue digital assistant voice activation commands,” it said.
“While different smart home configurations will exist, it is not inconceivable that some homes might have digital assistants configured to open smart locks on front doors, for example. One can thus imagine an attacker outside of a property, connecting without authentication to a Bluetooth toy to stream audio commands to enact a second-order objective, such as ‘Alexa, unlock the front door’.”
A similar attack could enable hackers to order goods from the victim household’s Amazon account and intercept them, claimed Which?
“Smart toys are one of the key areas identified by the government’s drive to make connected products ‘secure by design’,” the group said. “We’re calling on the toys industry to ensure that unsecure products like the ones we’ve identified are either modified, or ideally made secure before being sold in the UK.” Source: Information Security Magazine