
Crisis, Climate And Conflict Drove Disaster Philanthropy

The NonProfit Times

That sounds like a lot of money. New data from the Center for Disaster Philanthropy (CDP) shows that while it is not record-level funding, it was the third-highest amount since the nonprofit started reporting such data in 2014. Overall giving to disasters worldwide totaled $1.7 billion, according to CDP data. However, it is just 1.4% of the $126.7


Harmonic helps investors query the startup searches of their wildest dreams

TechCrunch

“We go out and look at every nook and cranny of the web where there might be information about companies, and we take that structured and unstructured data and figure out how to merge it all together into some canonical representation of a company,” Ruderman told TechCrunch.



Marketing data management platform Claravine nabs $16M

TechCrunch

CEO Verl Allen says that the new money brings the company’s total raised to $27.9 Toward this end, Claravine provides a dashboard where companies can build taxonomies using descriptions, lists, values, and referenceable fields.


A Conversation with Michael Gilbert on Nonprofit Blogging

Beth's Blog: How Nonprofits Can Use Social Media

The contemporary organization is the wrong model for civil society; the corporate structure in particular is dysfunctional, and human beings fit more naturally, and are more empowered, in communities, movements, and networks. And I know I would have to raise money for such a project, and I'm weary of trying to convince funders of anything.


AI for AI safety

The AI Alignment Forum

Eating free energy: More generally, non-rogue AIs operating and competing in the economy at large make it harder for rogue AIs to easily gain outsized amounts of money and influence. Indeed, some strategies in the vicinity of AI for AI safety have roughly the following structure: [31] Step 1: Get to/create the AI for AI safety sweet spot.


Prioritizing threats for AI control

The AI Alignment Forum

…terrorism, hacking of the AI company from the outside without privileged access, making money and then funding problematic stuff), and generally try to steer the world toward a state where AI takeover is more likely. For example, AI agents might: try to persuade or influence people to behave differently (e.g.