
In skin care, greenwashing means a company claiming its products are natural or organic when they're not.
The main problem with greenwashing is that consumers lose faith in genuinely natural brands. I've heard from many people that they bought a "natural" product and it wasn't good for them. Usually it's because they believed the product was natural when it wasn't, and the harsh chemicals irritated their skin.
Don't be fooled by false advertising.
The natural lifestyle has become very popular lately (which is good!), but companies take advantage of it. They also know that most consumers won't check the ingredient list, because they simply trust the claims on the label. Calling a product "natural" has become big business.
Whenever I went shopping, I used to check the beauty section. Every single time I found a product labeled natural or organic, I was disappointed. Although the bottle looked perfect, covered with nature-inspired images, the ingredients weren't even close to being natural.