There is something nearly poetic about the fact that Facebook founder and CEO Mark Zuckerberg all but renounced his past views on data privacy on Ash Wednesday, the start of a 40-day period of atonement in the Christian liturgical calendar.

Although the timing was undoubtedly coincidental, this was the infamous social media giant’s most telling renunciation of past sins. Like a rambling late-night call from a desperate ex, Zuckerberg’s 3,000-word treatise, posted on Facebook, practically begged users to trust him and his site once again: We know you’ve been hurt, but that was the old Facebook; we’ve changed and matured so much; you need to give us another chance.

Count us among the skeptics, but let’s give Zuckerberg’s proposal a fair airing.

The blog post at the center of it all, “A Privacy-Focused Vision for Social Networking,” comes on the heels of announced plans to merge Instagram DMs, Facebook Messenger, and WhatsApp into a single, integrated messaging service.

“As I think about the future of the internet, I believe a privacy-focused communications platform will become even more important than today’s open platforms,” Zuckerberg wrote. “Privacy gives people the freedom to be themselves and connect more naturally, which is why we build social networks.”

As Zuckerberg sees the big picture of social media, “private messaging, ephemeral stories, and small groups are by far the fastest growing areas of online communication.”

“There are a number of reasons for this,” he wrote. “Many people prefer the intimacy of communicating one-on-one or with just a few friends. People are more cautious of having a permanent record of what they’ve shared. And we all expect to be able to do things like payments privately and securely. … With all the ways people also want to interact privately, there’s also an opportunity to build a simpler platform that’s focused on privacy first.”

The first of Zuckerberg’s mea culpas and prognostications in the post: “I understand that many people don’t think Facebook can, or would, even want to build this kind of privacy-focused platform—because frankly we don’t currently have a strong reputation for building privacy protective services, and we’ve historically focused on tools for more open sharing.”

“I believe the future of communication will increasingly shift to private, encrypted services where people can be confident what they say to each other stays secure and their messages and content won’t stick around forever,” he added. “This is the future I hope we will help bring about.”

Zuckerberg plans to follow the blueprint established by developing and refining WhatsApp since its $19 billion acquisition in 2014: “focus on the most fundamental and private use case—messaging—make it as secure as possible, and then build more ways for people to interact on top of that, including calls, video chats, groups, stories, businesses, payments, commerce, and ultimately a platform for many other kinds of private services.”

The key to everything Facebook wants to do is end-to-end encryption. “Encryption is decentralizing—it limits services like ours from seeing the content flowing through them and makes it much harder for anyone else to access your information,” Zuckerberg wrote. “This is why encryption is an increasingly important part of our online lives, from banking to healthcare services. It’s also why we built end-to-end encryption into WhatsApp after we acquired it. On balance, I believe working towards implementing end-to-end encryption for all private communications is the right thing to do.”

Zuckerberg goes on to explain that a key challenge in building social tools is the “permanence problem.”

“As we build up large collections of messages and photos over time, they can become a liability as well as an asset,” he wrote. “Many people who have been on Facebook for a long time have photos from when they were younger that could be embarrassing. But people also really love keeping a record of their lives. … I believe there’s an opportunity to set a new standard for private communication platforms—where content automatically expires or is archived over time.”

Another roadblock is secure data storage.

“People want to know their data is stored securely in places they trust,” Zuckerberg wrote. “Looking at the future of the internet and privacy, I believe one of the most important decisions we’ll make is where we’ll build data centers and store people’s sensitive data. There’s an important difference between providing a service in a country and storing people’s data there. As we build our infrastructure around the world, we’ve chosen not to build data centers in countries that have a track record of violating human rights like privacy or freedom of expression. If we build data centers and store sensitive data in these countries, rather than just caching non-sensitive data, it could make it easier for those governments to take people’s information.”

He admitted that upholding this principle “may mean that our services will get blocked in some countries, or that we won’t be able to enter others anytime soon. That’s a tradeoff we’re willing to make.”

Zuckerberg concluded: “I believe we should be working towards a world where people can speak privately and live freely knowing that their information will only be seen by who they want to see it and won’t all stick around forever. If we can help move the world in this direction, I will be proud of the difference we’ve made.”

We now return to our regularly scheduled skepticism.

Zuckerberg talks a good game, but it is hard to fully embrace his newfound role as a privacy evangelist.

Google any combination of “Facebook” and “data privacy” and the seemingly infinite number of hits—from Beacon to Cambridge Analytica—won’t do much in the way of convincing you. Neither do Zuckerberg’s notorious youthful comments on data privacy. Asked in an instant-message exchange how an early version of Facebook, built during his days at Harvard University, convinced so many students to upload photos and personal data, he infamously replied: “I don’t know why. They trust me. Dumb f**ks.” His less obscene, more recent comments have suggested that we live in a post-privacy world where control over what gets shared, and with whom, matters more than privacy protections.

Nevertheless, there are a lot of great ideas in the blog post. The plan to integrate and unify the messaging platforms, though not warmly embraced when it was announced, could eventually be a boon to users and set the stage for a true payment system, something Facebook has long sought.

Big questions, however, demand answers. How, exactly, will Facebook make money if it diminishes data mining and the newsfeed advertising it fuels?

Will we actually see some of these strategic changes any time soon, given that many past Facebook promises (anonymous logins, a “clear history” feature) took forever-and-a-day to materialize, if they materialized at all?

How much of these grandiose plans are just a dog-and-pony show to appease regulators?

If Zuckerberg is now a true believer in data privacy, why did Facebook aggressively lobby to weaken the California Consumer Privacy Act and work hard to shape nearly inevitable federal legislation into something more agreeable to its own interests?

To what degree are these moves a response to routine attacks by EU officials and potentially costly and restrictive U.S. regulation and enforcement, not least of which is the Federal Trade Commission gearing up to levy a multibillion-dollar fine for violating a past, data privacy-related consent decree? Is bundling the messaging capabilities of Facebook, WhatsApp, and Instagram a strategy intended to complicate and deter any forthcoming regulatory effort to split up the company’s business lines?

Will any of this be enough to convince regulators of the merits of self-regulation, perhaps easing up on harsher rules in the process?

The most fundamental question of all: Can a company that has spent years chipping away at the data protections of its user base truly be reborn as a privacy champion?