Proposal: Strategic AWS Partnership with Dinis Cruz’s GenAI and Graph Innovations
by Dinis Cruz and ChatGPT Deep Research, 2025/06/03
Executive Summary¶
Dinis Cruz is a seasoned cybersecurity technologist and researcher who has spent over six years building advanced solutions on Amazon Web Services (AWS). He has a proven track record of leveraging AWS’s serverless architecture and Generative AI (GenAI) services to create innovative tools in areas like cloud automation and knowledge graphs. Notably, Cruz spearheaded the development of serverless GenAI and graph-based applications – including the open-source MGraph-AI graph database and the OSBot-AWS automation toolkit – which showcase the power of AWS Lambda, Amazon S3, and related services in real-world use. These projects fill critical gaps in the AWS ecosystem, demonstrating capabilities that AWS’s own offerings do not yet natively provide. For example, Cruz identified the absence of a truly serverless graph database in AWS’s lineup; existing graph databases were too heavy or impractical for on-demand, Lambda-based use. In response, he created MGraph-AI, a memory-first graph database optimized for AI workloads and designed for serverless deployment (zero cost when idle). This innovation presents a timely opportunity for AWS to partner with Cruz to bridge a strategic gap.
By collaborating with Dinis Cruz, AWS can co-develop and invest in these cutting-edge solutions, reaping mutual benefits. His flagship projects – MGraph-AI, OSBot-AWS, MyFeeds.ai, and The Cyber Boardroom – are built on AWS and align closely with AWS’s priorities in serverless computing and GenAI (including plans to integrate with Amazon Bedrock). AWS’s sponsorship and support could accelerate the evolution of a serverless graph database service and highlight AWS’s cloud as the premier platform for GenAI-powered applications. We propose that AWS engage with Cruz through a strategic partnership that includes co-development efforts, showcasing his work as case studies for GenAI and serverless best practices, and sponsoring ongoing open-source development. This document provides an overview of Dinis Cruz’s background, a timeline of his AWS-centric projects, technical context for each innovation, and specific collaboration opportunities that would offer high strategic value for AWS.
Background: Dinis Cruz’s AWS-Centric Experience and Vision¶
Dinis Cruz is a cybersecurity expert and software innovator with roles including Founder of The Cyber Boardroom and Chief Scientist at Glasswall. Over his career, he has cultivated deep expertise in cloud development, focusing particularly on AWS technologies. Since around 2018, Cruz has extensively utilized AWS services to build applications and automation frameworks in support of security and AI initiatives. His background in application security (as a former OWASP leader and virtual CISO/CTO) informs the design of his solutions – which emphasize secure, scalable architectures on AWS.
Crucially, Cruz has been an early adopter and advocate of serverless computing on AWS. He recognized the agility and cost-efficiency of services like AWS Lambda, Amazon API Gateway, and Amazon S3 for building flexible workflows. Over the past six+ years, he applied these services across numerous projects, gaining hands-on proficiency with AWS Lambda (for on-demand computation), Amazon S3 (as a persistent storage and static hosting layer), Amazon CloudFront (for global content delivery), AWS CloudFormation/IaC (to automate infrastructure provisioning), and AWS IAM (to enforce fine-grained security for cloud resources). This AWS-centric development experience is evident in the projects highlighted below, which collectively span cloud automation bots, GenAI-driven applications, and novel database technologies – all rooted in AWS’s cloud ecosystem.
Timeline of Key Projects and Research (AWS-Based)¶
- 2019 – OWASP Security Bot (OSBot) on AWS: Cruz developed the OWASP Security Bot, known as OSBot, to automate security tasks by integrating various tools (Jira, Slack, Elastic, etc.) on a serverless AWS stack. In a 2019 presentation, he demonstrated how OSBot uses AWS-backed serverless workflows to aggregate data (e.g. Jira issues) and generate knowledge graphs for security decision-making. This project laid the groundwork for treating cloud data sources as an interactive graph database (“Jira as a Graph DB”) and showcased AWS as a viable platform for on-demand data analysis.
- 2018–2021 – Expansion of OSBot and AWS Tooling: Over the next few years, Cruz expanded OSBot into a comprehensive toolkit for AWS automation. He created OSBot-AWS, an open-source Python library bundling a “large number of AWS APIs and utils” to simplify use of the boto3 SDK. This toolkit encapsulated common tasks (across S3, EC2, CloudFormation, IAM, etc.), making cloud infrastructure easier to deploy and manage programmatically. By version 1.0 in this period, OSBot-AWS was enabling intuitive automation of AWS resources, and it served as the backbone for various security workflows (including pipelines that connected Jira and other systems into AWS services). These efforts reflected Cruz’s deepening AWS expertise and set the stage for integrating AI capabilities.
- 2022 – GenAI Integration and Knowledge Graph R&D: With the emergence of advanced Generative AI (e.g. GPT-3/4), Cruz began merging LLM (Large Language Model) capabilities into his AWS solutions. He experimented with using GenAI to enhance cloud security operations and data analysis. Notably, he started developing graph-based approaches to feed context into AI models – an area that would become central to his later projects. Around this time, the concept for a specialized graph database to support AI (which later became MGraph-AI) took shape, driven by the need to efficiently connect facts and context for AI reasoning. This R&D phase laid the groundwork for a memory-first, AI-friendly graph store, leveraging AWS’s storage (for persistence) and compute services.
- Early 2024 – Launch of The Cyber Boardroom (Serverless GenAI Platform): In April 2024, Dinis Cruz formally launched The Cyber Boardroom as a startup and platform, underscoring his commitment to AWS and GenAI. The Cyber Boardroom is a cloud-native, serverless application on AWS aimed at helping executives and boards with cybersecurity decision-making. It is “driven by two key technological paradigms: 1) Serverless architecture (currently on AWS) and 2) GenAI bots/agents”. Cruz’s team built the platform using AWS Lambda, API Gateway, and AWS storage services to host a conversational AI assistant named Athena. Athena is a Python-based GenAI cyber advisor running serverlessly, initially powered by OpenAI’s GPT-4 and slated to integrate with Amazon Bedrock to use AWS-native LLMs. This marked a milestone where Cruz’s AWS know-how and GenAI research converged into a live product, reinforcing strategic alignment with AWS’s focus on AI and serverless tech.
- Late 2024 – Serverless News Feed Architecture (Cyber Boardroom): To enrich the Boardroom platform, Cruz engineered a custom news feed processing pipeline on AWS. In December 2024, he detailed an architecture that collects cybersecurity RSS feeds, processes them into JSON, and serves personalized insights via two serverless APIs. The system uses Amazon S3 (with CloudFront) as one API for direct, high-speed data access, and a FastAPI service on AWS Lambda as a second API for dynamic queries. This design cleverly combines AWS services: S3 for its scalability and speed, and Lambda for added functionality, with both exposed through secure endpoints. The result is a highly performant, easily maintainable solution that leverages serverless computing and GenAI in tandem. This project not only demonstrates Cruz’s ability to optimize AWS services for low-latency data delivery (~30ms via S3, ~150ms via Lambda) but also serves as a precursor to the MyFeeds.ai concept.
- Early 2025 – Introduction of MGraph-AI (Serverless Graph Database): In January 2025, Cruz released MGraph-AI, a breakthrough open-source graph database designed specifically for AI and serverless environments. After facing challenges with existing graph databases that “did not support serverless” and were too complex or costly to run for his use cases, he developed MGraph-AI to fill that gap. MGraph-AI keeps data in-memory for speed and persists to JSON files on disk (or cloud storage) for durability, enabling lightning-fast graph operations suitable for AWS Lambda and other ephemeral runtimes. Crucially, it incurs zero cost when not in use (aligning with serverless cost models) and supports Generative AI workloads like retrieval-augmented generation (Graph-RAG) with ease. This project’s release was a significant milestone in Cruz’s research evolution, encapsulating years of learning from OSBot and Cyber Boardroom. It immediately opened new possibilities for building on-demand knowledge graphs in AWS cloud workflows.
- March 2025 – MyFeeds.ai MVP (Personalized Semantic News Feeds): Building on the news feed architecture and MGraph-AI, Cruz launched MyFeeds.ai as a proof-of-concept service for personalized news digestion. The MyFeeds.ai MVP demonstrated a multi-phase pipeline where raw news articles are ingested, transformed into a semantic knowledge graph (using MGraph-DB), and then summarized or personalized via LLMs. The system achieved provenance and deterministic behavior in an LLM-powered feed by breaking down the problem: using graphs to map entities and relationships from news, then guiding the LLM with this structured context. Like his other projects, MyFeeds.ai was implemented on AWS’s serverless stack (with AWS Lambda functions orchestrating the flows and S3 storing intermediate JSON and graph data). This project showcased how graph databases and GenAI can work together on AWS to deliver tailored intelligence, and further proved out the capabilities of MGraph-AI in a real-world scenario. By 2025, Cruz’s ecosystem of projects (OSBot-AWS, Cyber Boardroom, MGraph-AI, MyFeeds.ai) formed a cohesive narrative of innovating on AWS – from cloud infrastructure automation to AI-driven applications – all using serverless principles and graph data models.
Project Highlights and Technical Context¶
OSBot-AWS: Serverless Automation and “Jira as a Graph DB”¶
OSBot-AWS is a cornerstone of Dinis Cruz’s AWS work – a toolkit that evolved from the OWASP Security Bot project to streamline AWS operations. The OSBot-AWS library (open-sourced on PyPI) provides a “large number of AWS APIs and utils” that wrap around boto3, making AWS actions more intuitive for developers. In practice, OSBot-AWS has been used to automate tasks like deploying cloud infrastructure, managing security dashboards, and integrating third-party services (Slack, Jira, etc.) with AWS services. By abstracting common patterns (for example, launching Lambdas, handling S3 files, running CloudFormation stacks), OSBot-AWS enabled rapid development of cloud-aware bots and workflows.
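To make the wrapping pattern concrete, the following is a minimal sketch in the spirit of OSBot-AWS, written against plain boto3. The class and method names are illustrative assumptions, not the library’s actual API.

```python
import json
import boto3

class S3Helper:
    """Illustrative wrapper in the OSBot-AWS spirit: hide boto3 plumbing
    behind small, intention-revealing methods (names are hypothetical)."""

    def __init__(self, bucket: str):
        self.bucket = bucket
        self.s3     = boto3.client("s3")

    def save_json(self, key: str, data: dict) -> str:
        # Serialize a Python dict and store it as a JSON object in S3
        self.s3.put_object(Bucket=self.bucket, Key=key,
                           Body=json.dumps(data).encode("utf-8"),
                           ContentType="application/json")
        return f"s3://{self.bucket}/{key}"

    def load_json(self, key: str) -> dict:
        # Fetch the object back and parse it into a dict
        body = self.s3.get_object(Bucket=self.bucket, Key=key)["Body"].read()
        return json.loads(body)

# Usage (hypothetical bucket and key):
# files = S3Helper("my-osbot-data")
# files.save_json("issues/latest.json", {"issue": "SEC-123", "status": "open"})
```

The value of this style is that higher-level bots and workflows can call one-line helpers instead of repeating boto3 boilerplate in every Lambda function.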
One of the notable achievements under OSBot was the concept of using Jira as a Graph Database for security data – effectively turning issue and project data into a navigable knowledge graph. In 2019, Cruz demonstrated OSBot’s ability to “generate graphs of data from tools like Jira to help understand security issues and their relationships,” all within a serverless architecture. In this setup, Jira tickets (representing vulnerabilities, assets, etc.) were nodes in a graph, and relationships (links, references, common labels) formed the edges, giving a visual map of an organization’s security posture. OSBot orchestrated this by pulling data from Jira and other sources, storing it in AWS (possibly in Elasticsearch or simple JSON in S3), and then using AWS Lambda and Jupyter notebooks for analysis and graph visualization. The “serverless architecture with components like Jira, Elastic, Slack, and Jupyter” meant that no always-on servers were needed – the solution could scale on-demand via AWS services.
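A hedged sketch of the “Jira as a Graph DB” idea is shown below, using networkx and a made-up, simplified issue export; the field names and relationships are assumptions for illustration rather than OSBot’s actual data model.

```python
import networkx as nx

# Hypothetical, simplified Jira export: each issue may link to related issues
issues = [
    {"key": "RISK-1", "summary": "Unpatched web server", "links": ["VULN-7"]},
    {"key": "VULN-7", "summary": "Known CVE on nginx",    "links": []},
]

graph = nx.DiGraph()
for issue in issues:
    graph.add_node(issue["key"], summary=issue["summary"])            # issue -> node
    for linked_key in issue["links"]:
        graph.add_edge(issue["key"], linked_key, relation="links to")  # link -> edge

# Traverse the graph to answer questions like "what does RISK-1 depend on?"
print(list(graph.successors("RISK-1")))   # ['VULN-7']
```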
Overall, OSBot-AWS showcased Cruz’s leadership in cloud automation: it not only provided immediate utility for security workflows but also underscored how AWS services can be glued together for creative outcomes. The “Jira as a Graph DB” use-case was ahead of its time in leveraging graph concepts for IT data, and it foreshadowed the dedicated graph database (MGraph-AI) that Cruz would later create. For AWS, OSBot-AWS stands as a testament to developer ingenuity on the platform – it is an example of an individual building a custom automation layer over AWS to solve domain-specific problems. This toolset, maintained for over 6 years, has matured alongside AWS (current version 2.x in 2025) and continues to be foundational in Cruz’s projects.
MGraph-AI: Memory-First Serverless Graph Database for GenAI¶
MGraph-AI is a groundbreaking project that exemplifies innovation born directly from AWS cloud needs. It is an open-source graph database implemented in Python, purpose-built to support AI and serverless applications. Cruz developed MGraph-AI after experiencing limitations with traditional graph databases in cloud environments – “Traditional graph databases did not support serverless and were either too heavy, too complex to deploy, or just didn’t fit” his workflow. MGraph-AI addresses this by taking a “memory-first” approach with JSON serialization for persistence. In effect, the entire graph resides in memory during operation (delivering extremely fast reads/writes), and data is saved to simple JSON files on disk (or Amazon S3) for durability. This design makes it ideal for AWS Lambda or Fargate tasks where you want instant graph performance and persistence without running a full-time database server.
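The following sketch illustrates the memory-first, JSON-persisted pattern described above. It is not MGraph-AI’s actual API, just a minimal stand-in showing why the approach suits Lambda-style runtimes: all reads and writes hit in-memory structures, and durability is a single JSON document.

```python
import json
from pathlib import Path

class MemoryFirstGraph:
    """Minimal sketch of a memory-first graph: operations work on in-memory
    dicts; persistence is a plain JSON file (or an S3 object)."""

    def __init__(self):
        self.nodes = {}   # node_id -> properties
        self.edges = []   # list of {"from": ..., "to": ..., "label": ...}

    def add_node(self, node_id, **props):
        self.nodes[node_id] = props

    def add_edge(self, from_id, to_id, label=""):
        self.edges.append({"from": from_id, "to": to_id, "label": label})

    def save(self, path: str):
        Path(path).write_text(json.dumps({"nodes": self.nodes, "edges": self.edges}))

    @classmethod
    def load(cls, path: str):
        data  = json.loads(Path(path).read_text())
        graph = cls()
        graph.nodes, graph.edges = data["nodes"], data["edges"]
        return graph

# g = MemoryFirstGraph()
# g.add_node("article-1", title="Example article")
# g.save("/tmp/graph.json")   # the same JSON could be written to S3 instead
```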
Key features of MGraph-AI include type-safe operations, a layered API (for data access, editing, filtering, storage), and minimal external dependencies – all of which make it well-suited for serverless deployments where package size and cold-start time matter. Importantly, MGraph-AI was created with AI/ML workloads in mind: it’s optimized for scenarios like semantic search, knowledge graphs, and retrieval-augmented generation (RAG) in LLMs. For example, one use case Cruz cites is “GenAI applications requiring fast graph support (like Graph RAG)”. By enabling graphs to be built and queried within a Lambda invocation (and incurring “zero cost when the DB is not being used”), MGraph-AI fills a niche that AWS’s current database lineup does not explicitly cover.
MGraph-AI’s impact is already evident in Cruz’s own applications. It provides the “critical part of the workflow” for MyFeeds.ai and similar projects by allowing easy manipulation and merging of JSON-based nodes and edges in memory. In other words, MGraph serves as the transient knowledge store that an LLM-powered app can consult for facts and context, all within a serverless context. The project is released under an open-source license (Apache 2.0) and actively encourages community contribution, reflecting Cruz’s philosophy of collaborative innovation.
For AWS, MGraph-AI represents a strategic technology: it demonstrates how a lean, cloud-native graph database can operate on AWS’s infrastructure. Currently, AWS offers Amazon Neptune for graph storage, including a “serverless” Neptune option, but even that requires managing clusters or Aurora-style scaling. By contrast, MGraph-AI shows an alternate model: truly on-demand graph data processing embedded in application code. This is directly aligned with serverless principles and could become a reference for AWS to either partner on or learn from in developing a first-party solution. In summary, MGraph-AI positions Cruz as a thought leader in cloud graph databases – and an ideal partner for AWS if it seeks to bolster its offerings in that area.
The Cyber Boardroom: Serverless GenAI Platform on AWS¶
The Cyber Boardroom is Dinis Cruz’s flagship GenAI application, and it’s built entirely on AWS’s serverless stack. The platform is essentially a personalized cyber-security advisor for executives, powered by large language models, and delivered through a web interface. From its inception, Cruz grounded The Cyber Boardroom in AWS technology choices to maximize scalability and minimize overhead. As he described, “The Cyber Boardroom is driven by two key technological paradigms: 1) Serverless architecture (currently on AWS) and 2) GenAI bots/agents.” This means every component – from the back-end logic to data storage – runs on managed AWS services that automatically handle scaling. For instance, the interactive AI assistant (Athena) is implemented as an AWS Lambda function (or a set of Lambda-backed APIs) that processes user queries with the help of an LLM. The website content (articles, videos, etc.) is likely served from Amazon S3 and CloudFront, while user interactions are tracked and handled by AWS services (API Gateway, DynamoDB or S3 for state, etc.). This architecture allowed Cruz to onboard 1,300 users from 40+ countries within months of launch with minimal infrastructure management.
A distinguishing feature of The Cyber Boardroom is its integration of Generative AI in a secure, enterprise-focused manner. Athena, the AI bot, was initially powered by OpenAI’s GPT-4, but Cruz strategically planned to adopt Amazon Bedrock to utilize AWS’s native GenAI offerings. By doing so, The Cyber Boardroom would leverage AWS-hosted foundation models (such as Amazon Titan or other Bedrock-partnered models) to ensure better data privacy, integration, and potentially cost benefits as usage grows. This forward-looking approach aligns the platform with AWS’s GenAI initiative, making it a potential showcase for Amazon Bedrock in a real-world application. In addition to Q&A-style interactions, the Boardroom also provides a content library and training materials, served by serverless backends (for example, Lambda functions that generate content recommendations).
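As an illustration of what the planned Bedrock integration could look like, here is a hedged sketch using boto3’s bedrock-runtime client. The model ID and the request/response schema are assumptions and would need to match whichever foundation model the platform adopts.

```python
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def ask_athena(question: str) -> str:
    # Titan-style request body (assumed schema; other Bedrock models differ)
    body = json.dumps({
        "inputText": question,
        "textGenerationConfig": {"maxTokenCount": 512, "temperature": 0.2},
    })
    response = bedrock.invoke_model(
        modelId="amazon.titan-text-express-v1",   # example model ID (assumed)
        body=body,
        contentType="application/json",
        accept="application/json",
    )
    payload = json.loads(response["body"].read())
    return payload["results"][0]["outputText"]    # Titan-style response (assumed)
```

Swapping the LLM provider behind a small function like this is what would let Athena move from OpenAI to AWS-hosted models with minimal change to the rest of the serverless stack.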
From a technical standpoint, one of the sub-projects within The Cyber Boardroom was the News Feeds architecture discussed earlier (Dec 2024). That solution exemplified how Cruz uses AWS services in unison: AWS CloudFront and S3 served as a low-latency content API, while AWS Lambda (with FastAPI) provided dynamic functionalities. The outcome was a system that could fetch and update cybersecurity news, then allow Athena (the GenAI bot) to draw on that curated data when advising users. By combining data engineering and AI on AWS, The Cyber Boardroom offers an end-to-end serverless solution – from data ingestion to AI interaction – all on a single cloud platform.
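A minimal sketch of the dynamic half of that two-API pattern is shown below, assuming FastAPI running on Lambda via the Mangum adapter and a hypothetical S3 bucket holding the pre-processed feed JSON; the static half is simply the same JSON objects served directly from S3 behind CloudFront.

```python
import json
import boto3
from fastapi import FastAPI
from mangum import Mangum   # ASGI adapter that lets FastAPI run on AWS Lambda

app = FastAPI()
s3  = boto3.client("s3")
BUCKET = "cyber-boardroom-feeds"        # hypothetical bucket name

@app.get("/feeds/{feed_id}")
def get_feed(feed_id: str):
    # Dynamic path: Lambda fetches the pre-processed JSON that the static
    # S3/CloudFront path also serves directly, and can filter, personalize,
    # or enrich it before returning.
    obj = s3.get_object(Bucket=BUCKET, Key=f"feeds/{feed_id}.json")
    return json.loads(obj["Body"].read())

handler = Mangum(app)   # configured as the Lambda function's handler
```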
For AWS executives, The Cyber Boardroom is a compelling case study candidate. It illustrates how a startup can build a complex, AI-driven service quickly by using AWS serverless components, and it addresses a high-impact domain (cybersecurity for corporate leaders). With further support, this platform could be officially showcased to demonstrate AWS’s strengths: imagine a public AWS case study or re:Invent presentation about how a vCISO built a next-gen advisory service using AWS Lambda, Amazon S3, and Amazon Bedrock. Such exposure would not only benefit Cruz’s venture but also highlight AWS’s capability to empower agile innovation in the AI era.
MyFeeds.ai: Personalized News Feeds via Graphs and AI (on AWS)¶
MyFeeds.ai is a spin-off project born from the Cyber Boardroom’s needs, which evolved into a standalone demonstration of personalized content delivery using GenAI and graph techniques. The premise of MyFeeds.ai is to provide users (e.g., CISOs or other professionals) with a tailored news digest, where each feed is generated through multiple stages of AI processing and data transformation. Technically, MyFeeds.ai leverages the AWS serverless stack similarly to the Boardroom platform. Ingesting RSS feeds is handled by scheduled AWS Lambda functions or container tasks that fetch new articles. These articles are then parsed and stored as structured JSON (using AWS storage). Next, the power of MGraph-AI comes into play: the JSON data (titles, summaries, entities extracted from articles) is loaded into an in-memory graph using MGraph, so that connections between topics, people, companies, etc., can be established. Cruz notes that MGraph-DB provided the ability to “easily manipulate and merge those JSON objects as nodes and edges” in the workflow. This step essentially creates a semantic knowledge graph of the day’s news.
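The ingestion and graph-loading steps might look roughly like the sketch below, using the feedparser library; the feed URL, tag handling, and node/edge shapes are illustrative assumptions rather than MyFeeds.ai’s exact implementation.

```python
import feedparser

def ingest_feed(feed_url: str) -> dict:
    """Pull an RSS feed, normalize entries to JSON, and express articles
    and their topics as nodes and edges (illustrative data shapes)."""
    parsed = feedparser.parse(feed_url)
    nodes, edges = {}, []
    for entry in parsed.entries:
        article_id = entry.link
        nodes[article_id] = {"type": "article", "title": entry.title,
                             "summary": entry.get("summary", "")}
        for tag in entry.get("tags", []):            # connect article -> topic
            topic_id = f"topic:{tag['term']}"
            nodes.setdefault(topic_id, {"type": "topic", "name": tag["term"]})
            edges.append({"from": article_id, "to": topic_id, "label": "about"})
    return {"nodes": nodes, "edges": edges}

# graph_json = ingest_feed("https://example.com/security-news.rss")  # hypothetical URL
```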
Once the graph is built, Generative AI takes the stage. MyFeeds.ai uses a series of LLM prompts (through an API like OpenAI or potentially AWS Bedrock in the future) to generate a summary or a blog-style post that is personalized to the user’s interests. Notably, Cruz’s design emphasizes provenance and determinism – meaning that the content generated by the AI can be traced back to factual nodes in the graph, and the AI’s behavior is made more predictable by feeding it structured context. The output is then delivered to the user, likely via a simple web interface or email, with all processing happening behind the scenes on AWS services. Because of the serverless design, this system can scale to many users and run cheaply (each personalized feed triggers a series of Lambda invocations and API calls, and if no one requests a feed, no resources are consumed).
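To show how structured graph context can make the generation step traceable, here is a hedged sketch of assembling a provenance-carrying prompt from the graph JSON. The function and field names are assumptions, and the resulting prompt could be sent to OpenAI today or to a Bedrock-hosted model later.

```python
def build_grounded_prompt(persona: str, graph_json: dict) -> str:
    """Assemble an LLM prompt in which every fact carries a node ID, so the
    generated digest can be traced back to specific graph entries
    (illustrative sketch, not the exact MyFeeds.ai prompt)."""
    facts = []
    for edge in graph_json["edges"]:
        source = graph_json["nodes"][edge["from"]]
        target = graph_json["nodes"][edge["to"]]
        facts.append(f"[{edge['from']}] {source.get('title', edge['from'])} "
                     f"-- {edge['label']} --> {target.get('name', edge['to'])}")
    return (
        f"You are writing a news digest for a {persona}.\n"
        "Use ONLY the facts below and cite the bracketed IDs you rely on:\n"
        + "\n".join(facts)
    )
```

Because the model is constrained to the listed facts and asked to cite their IDs, the output can be checked against the graph, which is the provenance and determinism property the design emphasizes.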
MyFeeds.ai demonstrates the synergy between graph databases and GenAI in a cloud environment. It validates that having an ephemeral, in-memory graph (MGraph) to curate context can significantly improve the quality of AI outputs: the LLM hallucinates less and stays grounded in the provided data. It also validates AWS as the platform to host such a pipeline, using Lambda for bursts of heavy lifting and S3/CloudFront to distribute results. While MyFeeds.ai is currently an MVP, it holds commercial promise (e.g., as a subscription feature for CISOs, as hinted in Cruz’s posts). More importantly for AWS, it’s another proof-point that complex multi-step AI pipelines can be built entirely with serverless components. From data collection to AI generation, AWS services underpinned the solution. This project, alongside The Cyber Boardroom, reinforces the message that AWS is an ideal playground for innovative GenAI applications.
Strategic Alignment with AWS Services and Goals¶
Dinis Cruz’s work aligns remarkably well with several of AWS’s strategic technology focus areas:
- Serverless Computing Leadership: All of Cruz’s projects embrace a serverless-first philosophy. By using AWS Lambda, Amazon API Gateway, Amazon S3, and related services, he has built systems that scale seamlessly and minimize operational maintenance. This directly showcases the value of AWS’s serverless offerings. For AWS, having real-world implementations like Cruz’s provides credible validation to other customers that even complex use cases (like graph databases or AI bots) can be handled with a serverless approach. It reinforces AWS’s messaging that serverless is not just for simple tasks but can power production-grade, innovative solutions. Cruz’s projects can thus be held up as exemplars of modern cloud architecture, encouraging broader adoption of Lambda, Step Functions, and other AWS services.
- Generative AI and Amazon Bedrock: AWS has invested heavily in Amazon Bedrock and related AI services to capture the GenAI trend. Cruz’s applications are precisely the kind of workloads AWS wants on its cloud – they use large language models to deliver specialized domain value (cybersecurity insights, personalized content). Notably, The Cyber Boardroom is already integrating with Amazon Bedrock to utilize AWS-hosted models, which positions it as an early adopter of that service. By partnering with Cruz, AWS can gain a compelling Bedrock success story in the cybersecurity domain. Moreover, Cruz’s focus on deterministic AI outputs and provenance (through graph backing) aligns with enterprise customers’ needs for trustworthy AI – something AWS can leverage as a narrative for Bedrock (“see how one founder built explainable AI on AWS”). This synergy means AWS’s GenAI initiatives would directly benefit from highlighting and supporting Cruz’s work.
- Graph Database Technology Gaps: Currently, AWS’s primary graph database offering is Amazon Neptune. While Neptune is a fully managed service, it traditionally required provisioning instances and wasn’t designed for transient, on-demand usage. AWS did introduce Neptune Serverless, but even so, the concept is bounded by the architecture of a database cluster (albeit auto-scaling) rather than embedding graph logic into serverless functions. There remains a gap for developers who want a lightweight, embedded graph engine that can run within Lambda or Fargate and scale down to zero when not in use. This is exactly the gap MGraph-AI fills. By collaborating around MGraph-AI, AWS can explore offering a first-party “serverless graph” solution or toolkit. This could prevent customers from looking to other clouds or third-parties for graph needs in serverless apps. In essence, Cruz’s research gives AWS a head start on a potential new feature or pattern (much like AWS adopted community projects in the past to enhance their offerings). Embracing this opportunity keeps AWS at the cutting edge of cloud services (since competing clouds do not yet have an equivalent to MGraph-AI for serverless scenarios).
- Open Source and Developer Engagement: All of Cruz’s projects are open source or have significant open components (OSBot-AWS, MGraph-AI, etc., are on GitHub under permissive licenses). AWS has a history of engaging with and sometimes sponsoring open-source projects that align with its services (for example, contributing to Kubernetes, adopting projects like Apache Airflow for MWAA, etc.). By supporting Cruz’s open-source work, AWS can foster goodwill in the developer community and drive deeper integration of AWS services into these projects. For instance, AWS could contribute enhancements to MGraph-AI (like connectors to Amazon S3 or Neptune compatibility modes), or it could help optimize OSBot-AWS for the latest AWS APIs. Such involvement would show that AWS is committed to community-driven innovation. Additionally, Cruz’s active blogging and community presence (e.g., LinkedIn articles, conference talks) means any AWS support will be visible to a wide audience of practitioners, indirectly marketing AWS’s strengths.
- Use-Case Diversity (Security, Data Automation, AI): The breadth of Cruz’s projects covers multiple domains – cloud security automation (OSBot), enterprise governance (Cyber Boardroom), data engineering (news feeds), and AI/ML (LLM applications). This diversity is valuable to AWS because it touches various customer segments. Partnering with Cruz can yield case studies or reference architectures in each of these areas:
  - Security Automation: Using AWS to integrate tools like Security Hub, Jira, and Slack for real-time risk management (OSBot use case).
  - Executive Enablement: Using AWS AI services to bridge the gap between technical cybersecurity data and executive decision-making (Cyber Boardroom use case).
  - Content Personalization: Building personalized content feeds with AWS serverless backends and AI (MyFeeds.ai use case).
  - Data Analytics with Graphs: Employing graph models on AWS for advanced analytics and context management (MGraph-AI use case).

Together, these use cases showcase AWS as a versatile platform across different problem domains, all unified by serverless and AI capabilities.
In summary, Dinis Cruz’s body of work is strategically aligned with AWS on multiple fronts – it leverages and extends AWS’s core services, it anticipates customer needs (serverless graph processing, explainable AI) that AWS is interested in, and it is delivered in a way (open-source, case-study-ready) that AWS can readily utilize for mutual benefit.
Proposed Collaboration Opportunities and Recommendations¶
To capitalize on the synergy outlined above, we recommend that AWS pursue a collaboration and investment partnership with Dinis Cruz across the following areas:
- Co-develop a Serverless Graph Database Offering: Partner with Cruz to bring a serverless graph database service or reference architecture to AWS customers. This could involve AWS sponsoring the development of MGraph-AI into an AWS-compatible managed library or service. For example, AWS and Cruz’s team could co-author an AWS Solutions Implementation or an AWS Quick Start that packages MGraph-AI for easy deployment on Lambda with S3 as backing storage (a minimal sketch of that Lambda-plus-S3 pattern appears after this list). In the longer term, AWS might integrate the concepts of MGraph-AI into a first-party service (filling the gap between DynamoDB/Neptune and the needs of serverless apps). By co-developing this, AWS gains a novel service offering, and Cruz’s design expertise ensures it meets real developer requirements. Success Metric: Within 12 months, a publicly announced AWS reference architecture or beta service for “Serverless Graph DB on AWS” is released, with MGraph-AI as its core and Dinis Cruz as a design partner – positioning AWS as a pioneer in this space.
- Feature Cruz’s Work as AWS Case Studies and Thought Leadership: Leverage the existing success of Cruz’s projects to create compelling case studies, conference talks, and blog content under the AWS banner. AWS could produce written case studies on The Cyber Boardroom and MyFeeds.ai, highlighting how they were built on AWS and the business outcomes achieved. These stories could be featured on the AWS Machine Learning Blog, AWS Startups Blog, or presented at events like re:Invent or AWS Summits (with Cruz co-presenting alongside an AWS representative). The focus would be on how AWS services (Lambda, Bedrock, etc.) enabled a small team to rapidly innovate in cybersecurity and AI. Additionally, AWS can invite Cruz to share best practices in webinars or Community Builder talks, solidifying AWS’s reputation as the go-to platform for GenAI innovation. Success Metric: Publication of 2–3 AWS official case studies or blog posts in the next year, and speaking slots for Cruz at major AWS events, thereby increasing customer awareness of AWS’s capabilities through these success stories.
- Sponsorship and Support for Open-Source Integration: Provide funding and resources to accelerate the development of Cruz’s open-source projects and their integration with AWS services. This can take the form of research grants or innovation awards, AWS Activate credits for his startup, or even a formal sponsorship where AWS funds specific features. For instance, AWS could sponsor enhancements to OSBot-AWS to cover new AWS service APIs (ensuring the toolkit stays up-to-date and robust), or fund the creation of integration modules that connect MGraph-AI with Amazon Neptune or Amazon SageMaker. AWS solution architects might also work with Cruz to incorporate his tools into AWS Well-Architected patterns for serverless and AI. Crucially, Cruz’s projects are Apache-2.0 licensed and open to contributors, which means AWS engineers could directly contribute code or guidance. Such support would not only improve the tools for all users but also signal AWS’s commitment to developer-driven innovation. Success Metric: Measurable advancements in the OSBot-AWS and MGraph-AI projects within 6–12 months (e.g., new releases with AWS-optimized features, better AWS documentation for them), coupled with acknowledgments of AWS’s support in those project communities.
- Joint Solution Offerings and Go-to-Market Initiatives: AWS and Dinis Cruz could collaborate on packaging his solutions for wider adoption. For example, AWS could help turn The Cyber Boardroom’s architecture into an AWS Solution that other enterprises can deploy (with a reference stack on AWS Quick Start). Similarly, an AWS Marketplace offering could be created for an “Athena-like” cyber boardroom bot or for a MyFeeds-style personalized feed generator, making it easy for AWS customers to spin up similar capabilities. By doing a joint go-to-market, AWS can showcase these offerings to enterprise clients (especially in financial, healthcare, and other sectors where board-level cyber advisory and personalized threat intel are in demand). In return, Cruz’s innovations gain a pathway to scale commercially under AWS’s wing. Success Metric: At least one joint solution or marketplace offering launched, and pilot engagements with key AWS customers (perhaps as design partners) who adopt these solutions in their environments.
- Advisory Role and Continuous Feedback Loop: Bring Cruz into AWS’s network as an advisor or community hero (if not already). AWS could formalize a relationship where Cruz provides feedback on AWS’s GenAI and serverless roadmap from an independent innovator’s perspective. For instance, he could join the AWS Community Builders or be an AWS Data Hero, which would amplify his voice and let AWS glean insights from his ground-level experimentation. His unique intersection of security and AI knowledge can guide AWS in developing features that appeal to that intersection (for example, AI services with better security controls, or security services with AI integrations). In addition, AWS could involve him in beta programs for new services (ensuring that when AWS rolls out new AI or database features, they are vetted against real use cases like his). Success Metric: Ongoing engagement established (quarterly calls or feedback sessions), and tangible suggestions from Cruz reflected in AWS service improvements or new feature releases over time.
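For the first recommendation above, the following is a minimal sketch (under assumed bucket and event shapes) of the Lambda-plus-S3 pattern such a reference architecture could package: load the graph JSON from S3, mutate it in memory, persist it back, and incur no cost while idle.

```python
import json
import boto3

s3     = boto3.client("s3")
BUCKET = "serverless-graph-store"       # hypothetical backing bucket

def lambda_handler(event, context):
    """Sketch of a 'serverless graph DB' invocation: no database cluster,
    just a JSON-persisted graph hydrated into memory per request."""
    key = event.get("graph_key", "graphs/default.json")
    try:
        graph = json.loads(s3.get_object(Bucket=BUCKET, Key=key)["Body"].read())
    except s3.exceptions.NoSuchKey:
        graph = {"nodes": {}, "edges": []}

    for node in event.get("add_nodes", []):          # apply requested mutations
        graph["nodes"][node["id"]] = node.get("properties", {})
    for edge in event.get("add_edges", []):
        graph["edges"].append(edge)

    s3.put_object(Bucket=BUCKET, Key=key,
                  Body=json.dumps(graph).encode("utf-8"),
                  ContentType="application/json")
    return {"nodes": len(graph["nodes"]), "edges": len(graph["edges"])}
```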
By pursuing these collaboration avenues, AWS stands to gain a high-impact partnership that not only elevates its product offerings but also demonstrates its commitment to supporting innovators. Dinis Cruz’s projects would, in turn, reach their full potential with AWS’s backing – a win-win scenario that can be publicized and celebrated. We recommend that AWS leadership designate a sponsorship and technical liaison team to begin scoping these opportunities in detail, and engage Dinis Cruz in discussions about formalizing this partnership.
Conclusion¶
Dinis Cruz’s work epitomizes the innovative spirit and technical excellence that AWS strives to enable in its customer community. His extensive experience building on AWS – from automating cloud infrastructure with OSBot-AWS to creating GenAI-driven platforms like The Cyber Boardroom – showcases the art of the possible on AWS’s cloud. He has identified and solved real-world problems (like the need for a serverless graph database) in ways that align perfectly with emerging market needs and AWS’s strategic direction.
By investing in a partnership with Cruz, AWS can fast-track the development of unique services (such as a serverless graph database), enrich its GenAI portfolio through authentic use cases, and inspire other customers through compelling success stories. This proposal has outlined how such a partnership could be structured, including co-development, case studies, open-source sponsorship, and joint offerings – all aimed at mutual strategic gain. AWS has the opportunity to elevate Cruz’s research into a flagship showcase of cloud innovation, while simultaneously strengthening its own platform with the insights and technologies born from that research.
In a competitive cloud market where enterprises are seeking advanced AI and data capabilities, embracing grassroots innovations like those of Dinis Cruz can differentiate AWS. It sends a message that AWS not only provides the tools but also actively collaborates with pioneers to push boundaries. We recommend AWS executives move forward to engage Dinis Cruz in crafting a partnership agreement. With AWS’s support, his next breakthroughs could become built-in AWS advantages. By acting on this proposal, AWS can turn an individual innovator’s achievements into a broader strategic win, reinforcing AWS’s reputation as the cloud for cutting-edge innovation and fostering a new success story that will resonate across both technology and business leadership circles.
Sources: Dinis Cruz’s project publications and posts (LinkedIn articles, PyPI project descriptions, and related materials). These sources detail the technical feats and context of the mentioned projects, corroborating the claims and opportunities discussed above. Each citation points to the specific statements by Cruz that underpin this proposal’s analysis and recommendations.