With more than 2 billion users worldwide, Facebook has been making tough calls when it comes to policing hate speech, harassment, nudity and violence online.

Some digital rights groups want it to provide more details about how it decides what to delete or keep.

"I think that because you have greater power, you have greater responsibility. And with that responsibility, sometimes you're going to have to bring out a little bit more transparency," Eva Galperin, Electronic Frontier Foundation's director of cybersecurity, told a Facebook executive at a panel discussion in San Francisco.

The EFF has found that Facebook's online rules are not enforced evenly, she said. When people get locked out of their accounts, it can disrupt their work or daily lives, especially because Facebook is linked to other apps.

Some activists have previously accused the company's content moderators of punishing minority users.

Alex Stamos, Facebook's chief security officer, said mistakes are bound to happen but are rare. Some posts might mistakenly get flagged, for example, when a user speaks out against hate speech.

"If you turn up that dial of trying to prevent you will also turn up the dial of false positives," he said.

Stamos estimated that the number of accounts Facebook shuts down each day is "at least seven or eight figures," based on the volume of spam and fraudulent accounts created daily.

In the past, public criticism has led Facebook to change how it enforces some of its online rules. The tech firm apologized last year after it pulled down a famous Vietnam War photo that depicted a naked girl fleeing a napalm attack.

But Stamos questioned whether divulging more details about how Facebook enforces its rules would help.

"I'm not sure if we're in a media environment honestly where a lot of transparency in this area is going to end up with people being better off," he said.