Goldsky Integrates IOTA EVM, Enhancing Data Accessibility and Real-Time Analytics | IDOs News

Goldsky has announced its successful integration with IOTA’s Ethereum Virtual Machine (EVM), a move set to revolutionize data accessibility and real-time analytics for developers working within the IOTA ecosystem, according to the IOTA Foundation Blog.

Revolutionizing Data Indexing for the IOTA Developer Ecosystem

The integration of Goldsky with IOTA EVM simplifies access to data, enabling real-time analytics and powering applications without the need for manual subgraph-based indexing infrastructure. This advancement allows developers to focus on building applications rather than managing complex blockchain data infrastructure. Goldsky’s product suite enhances data accessibility, enabling efficient and cost-effective development on IOTA EVM.

Fast and accurate data is crucial for the success of any decentralized application. Managing data across transactions, addresses, tokens, blocks, and smart contracts can be resource-intensive. Many existing solutions fall short, offering pre-made endpoints that lack flexibility, hosted services with low reliability, and complex products that take weeks to set up.

Goldsky stands out as a leading indexer in the crypto space, empowering thousands of crypto businesses to build rich, instant, data-driven experiences. With its recent integration into IOTA EVM, Goldsky aims to make data access and real-time analytics much easier for developers, ensuring exceptional user experiences in the IOTA ecosystem.

Delivering Data Analysis: Goldsky’s Product Suite

Goldsky’s holistic product suite provides unparalleled data indexing capabilities, enabling developers to seamlessly access, manage, and analyze blockchain data with precision and efficiency. By eliminating the need for developers to run their own data servers, build indexing infrastructure, or manually parse data, Goldsky reduces costs and ensures continuous data availability, allowing developers to focus on their projects.

For IOTA EVM, Goldsky offers an easy-to-use platform for building subgraphs and real-time data replication pipelines. Goldsky’s self-serve products can be used independently or together to power a data stack:

  1. Goldsky Subgraphs enable developers to intelligently extract blockchain data with ease, handling reorgs, RPC provider failures, and other complexities automatically. Goldsky provides a high-performance hosted subgraph offering compatible with the open-source graph-node spec, featuring enhanced developer experiences such as webhooks and advanced analytics.
  2. Goldsky Mirror allows developers to replicate subgraph data or chain-level streams directly to a data store of their choosing, enabling flexible usage in front-end and back-end applications. Mirror supports high-throughput, low-latency, and parallelizable indexing, facilitating chain-level data use cases not otherwise possible.
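The replication idea behind Mirror can be illustrated with a minimal sketch: chain-level events stream into a local datastore, so the application queries SQL instead of hitting the chain. The event shape and table layout below are illustrative assumptions, not Goldsky's actual schema or API.

```python
import sqlite3

# Hypothetical sketch of stream-to-datastore replication: each chain-level
# event is written into a SQL table that the application then queries.
# Field names here are made up for illustration.
def replicate(events, conn):
    conn.execute(
        """CREATE TABLE IF NOT EXISTS transfers (
               block INTEGER, tx_hash TEXT, sender TEXT,
               recipient TEXT, amount INTEGER)"""
    )
    conn.executemany(
        "INSERT INTO transfers VALUES (:block, :tx_hash, :sender, :recipient, :amount)",
        events,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
stream = [
    {"block": 101, "tx_hash": "0xa1", "sender": "0x01", "recipient": "0x02", "amount": 500},
    {"block": 102, "tx_hash": "0xb2", "sender": "0x02", "recipient": "0x03", "amount": 250},
]
replicate(stream, conn)
total = conn.execute("SELECT SUM(amount) FROM transfers").fetchone()[0]
print(total)  # 750
```

Once the data lands in a store the developer controls, front-end and back-end code can use ordinary SQL aggregation rather than bespoke indexing logic.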

For more detailed information on Goldsky’s capabilities, visit the Goldsky website and documentation.

Embracing the Future Together

The integration of Goldsky with IOTA EVM marks a significant step in unlocking new data-driven potential within the IOTA ecosystem. This collaboration is expected to pave the way for a data-driven future in the distributed ledger technology (DLT) space.

Image source: Shutterstock




Enhancing Agent Planning: Insights from LangChain | IDOs News

Alvin Lang
Jul 21, 2024 04:57

LangChain explores the limitations and future of planning for agents with LLMs, highlighting cognitive architectures and current fixes.





According to a recent LangChain Blog post, planning for agents remains a critical challenge for developers working with large language models (LLMs). The article delves into the intricacies of planning and reasoning, current fixes, and future expectations for agent planning.

What Exactly Is Meant by Planning and Reasoning?

Planning and reasoning by an agent involve the LLM’s ability to decide on a series of actions based on available information. This includes both short-term and long-term steps. The LLM evaluates all available data and decides on the first step it should take immediately, followed by subsequent actions.

Most developers use function calling to enable LLMs to choose actions. Function calling, first introduced by OpenAI in June 2023, allows developers to provide JSON schemas for different functions, enabling the LLM to match its output with these schemas. While function calling helps in immediate actions, long-term planning remains a significant challenge due to the need for the LLM to think about a longer time horizon while managing short-term actions.
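The mechanics can be sketched in a few lines. The schema format below follows the general shape of OpenAI-style function calling, but the tool name, fields, and the simulated model response are illustrative assumptions, not any provider's exact API.

```python
import json

# Illustrative JSON schema for one tool, in the general style of
# function calling: the developer describes the function, and the
# model emits a call whose arguments match the schema.
get_weather_schema = {
    "name": "get_weather",
    "description": "Look up the current weather for a city",
    "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

def get_weather(city: str) -> str:
    # Stubbed tool implementation; a real one would call a weather API.
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

# A function-calling model returns the chosen function name plus JSON
# arguments; here the model response is simulated.
model_output = {"name": "get_weather", "arguments": json.dumps({"city": "Berlin"})}

fn = TOOLS[model_output["name"]]
args = json.loads(model_output["arguments"])
print(fn(**args))  # Sunny in Berlin
```

This dispatch step is the "immediate action" part that function calling handles well; deciding which calls to make over a long horizon is the harder planning problem the article describes.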

Current Fixes to Improve Planning by Agents

One of the simplest fixes is ensuring the LLM has all the necessary information to reason and plan appropriately. Often, the prompt passed into the LLM lacks sufficient information for reasonable decision-making. Adding a retrieval step or clarifying prompt instructions can significantly improve outcomes.

Another recommendation is changing the cognitive architecture of the application. Cognitive architectures can be categorized into general-purpose and domain-specific architectures. General-purpose architectures, like the “plan and solve” and Reflexion architectures, provide a generic approach to better reasoning. However, these may be too general for practical use, leading to the preference for domain-specific cognitive architectures.

General Purpose vs. Domain Specific Cognitive Architectures

General-purpose cognitive architectures aim to improve reasoning generically and can be applied to any task. For example, the “plan and solve” architecture involves planning first and then executing each step. The Reflexion architecture includes a reflection step after task completion to evaluate correctness.
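A minimal sketch of these two general-purpose patterns, with a canned stand-in for the model call (the `llm` function and its responses are assumptions for illustration, not LangChain code):

```python
# "Plan and solve": plan first, execute each step, then add a
# Reflexion-style check that evaluates the finished attempt.
def llm(prompt: str) -> str:
    # Stand-in for a real model call, keyed on the prompt prefix.
    canned = {
        "plan": "1. parse input\n2. compute answer",
        "reflect": "OK",
    }
    return canned.get(prompt.split(":")[0], "done")

def plan_and_solve(task: str) -> list[str]:
    plan = llm(f"plan: {task}").splitlines()          # plan first
    results = [llm(f"execute: {step}") for step in plan]  # then execute
    # Reflexion-style step: a real agent would retry with the
    # critique appended to the prompt if the reflection fails.
    if llm(f"reflect: {results}") != "OK":
        return plan_and_solve(task)
    return results

print(plan_and_solve("add 2 and 3"))
```

The loop is generic by design, which is exactly the article's point: it applies to any task, but offers no task-specific guidance.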

Domain-specific cognitive architectures, on the other hand, are tailored to specific tasks. These often include domain-specific classification, routing, and verification steps. The AlphaCodium paper demonstrates this with a flow engineering approach, specifying steps like coming up with tests, then a solution, and iterating on more tests. This method is highly specific to the problem at hand and may not be applicable to other tasks.

Why Are Domain Specific Cognitive Architectures So Helpful?

Domain-specific cognitive architectures help by providing explicit instructions, either through prompt instructions or hardcoded transitions in code. This method effectively removes some planning responsibilities from the LLM, allowing engineers to handle the planning aspect. For instance, in the AlphaCodium example, the steps are predefined, guiding the LLM through the process.
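A toy sketch in the spirit of the AlphaCodium flow shows what "hardcoded transitions" means in practice: the tests-then-solution-then-iterate sequence lives in ordinary code, so the LLM never plans the overall flow. The `generate` stub and the toy coding task are assumptions for illustration.

```python
# Domain-specific flow with hardcoded transitions: step order is fixed
# by the engineer (tests -> solution -> iterate), not chosen by the LLM.
def generate(kind: str, context: dict) -> object:
    # Stand-in for model calls; returns canned artifacts for a toy task.
    if kind == "tests":
        return [((2, 3), 5), ((0, 0), 0)]   # (inputs, expected output) pairs
    if kind == "solution":
        return lambda a, b: a + b           # candidate program
    raise ValueError(kind)

def coding_flow(task: str):
    ctx = {"task": task}
    ctx["tests"] = generate("tests", ctx)       # step 1: derive tests first
    for _ in range(3):                          # step 3: bounded iteration
        solution = generate("solution", ctx)    # step 2: propose a solution
        if all(solution(*args) == want for args, want in ctx["tests"]):
            return solution
    raise RuntimeError("no passing solution")

add = coding_flow("add two integers")
print(add(4, 5))  # 9
```

Because the sequence is explicit, each model call only has to solve one narrow subproblem, which is why these architectures tend to be more reliable in production.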

Nearly all advanced agents in production utilize highly domain-specific and custom cognitive architectures. LangChain makes building these custom architectures easier with LangGraph, designed for high controllability, which is essential for creating reliable custom cognitive architectures.

The Future of Planning and Reasoning

The LLM space has been evolving rapidly, and this trend is expected to continue. General-purpose reasoning is likely to become more integrated into the model layer, making models more intelligent and capable of handling larger contexts. However, there will always be a need to communicate specific instructions to the agent, whether through prompting or custom cognitive architectures.

LangChain remains optimistic about the future of LangGraph, believing that as LLMs improve, the need for custom architectures will persist, especially for task-specific agents. The company is committed to enhancing the controllability and reliability of these architectures.

Image source: Shutterstock



Binance (BNB) Unveils CPT Framework to Analyze Crypto Market Dynamics | IDOs News

Rebeca Moen
Jul 21, 2024 09:51

Binance Research introduces the CPT Framework to analyze crypto market dynamics, focusing on capital, people, and technology as key structural drivers.





Binance Research has introduced a comprehensive framework to analyze the current state of the cryptocurrency market, termed the CPT Framework. This model aims to shed light on both short- and long-term drivers influencing market dynamics, according to Binance Research.

The past few months have been challenging for the crypto markets. Following a rapid rise at the start of the year, the market has been trading within a range. June saw an 11.4% decline in total crypto market capitalization month-on-month, despite some recent relief. As of now, the market remains 14% down from its March peak.

Drivers of Market Weakness

Several market events have contributed to the recent decline in crypto prices. Key among these was the distribution of 140,000 BTC (approximately $9 billion) to Mt. Gox creditors starting July 5. Additionally, the German government transferred 50,000 BTC (~$3.2 billion) to centralized exchanges and market makers between June 19 and July 13. The U.S. government also transferred 3,940 BTC (worth $248 million) to Coinbase Prime on June 26. Despite these large-scale disposals, some mitigating factors suggest that the impact may be short-lived.
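A quick arithmetic check shows the three figures above are mutually consistent: each disposal's dollar value divided by its BTC amount implies a per-coin price in the low-to-mid $60,000s.

```python
# Sanity check on the figures above: dollar value / BTC amount gives
# the implied per-coin price for each disposal.
disposals = {
    "Mt. Gox creditors": (140_000, 9.0e9),
    "German government": (50_000, 3.2e9),
    "U.S. government":   (3_940, 248e6),
}
for who, (btc, usd) in disposals.items():
    print(f"{who}: ~${usd / btc:,.0f} per BTC")
```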

Introducing the CPT Framework

Binance’s CPT Framework categorizes structural market factors into three distinct areas: Capital, People, and Technology. Each of these factors plays a crucial role in shaping the long-term health of the crypto market.

1. Capital

The influx of new money into the crypto ecosystem has slowed. This stagnation has resulted in a “Player vs. Player” (PvP) market, where participants compete for returns. Indicators such as stablecoin supply stagnation, a slowdown in funds raised by projects, and outflows from spot BTC ETFs highlight this trend.

Key Takeaways:

  • New capital is essential for sustainable market growth.
  • Attracting new capital requires appealing to investors across primary, secondary, and traditional finance markets.
  • Strong fundamentals and clear narratives are beneficial in attracting and retaining investor interest.

2. People

Market participants have faced challenges in generating sustainable returns. Retail users, institutional investors, project teams, market makers, and regulators have all been impacted by high valuations and sell pressure from token unlocks. Falling trading volumes since March further indicate a challenging environment.

Key Takeaways:

  • High valuations and low initial circulating supplies pose long-term structural challenges.
  • Increased awareness and research on tokenomics can help mitigate these issues.
  • Support for high-quality projects with small to medium market capitalization is crucial for a healthy market environment.

3. Technology

Technological advancements in blockchain and crypto, such as scaling solutions and user-focused developments, are crucial for onboarding new users. However, the focus remains disproportionately on infrastructure projects, which need to be balanced with the development of diverse and innovative dApps.

Key Takeaways:

  • Technological innovations attract a broader audience by providing tangible use cases.
  • Funding should be redirected to develop user-friendly dApps to amplify the reach of the crypto ecosystem.

Market Outlook

Despite recent challenges, Binance Research remains optimistic about the market’s outlook for the rest of the year. Several upcoming catalysts could propel the industry forward, including potential approvals of spot ETH ETFs, a favorable macro environment with potential interest rate cuts, the U.S. Presidential Election, and the Bitcoin halving event.

Market cycles consist of periods of ups and downs. Pullbacks serve as a healthy reset when there are excesses in the market. Long-term investors might see market corrections as opportunities to add to their portfolios, while risk-averse investors may consider holding their positions.

Image source: Shutterstock



NVIDIA Advances AI-Driven 6G Innovation with AI-RAN Alliance, 3GPP, and O-RAN | IDOs News

Rebeca Moen
Jul 21, 2024 05:27

NVIDIA collaborates with AI-RAN, 3GPP, and O-RAN to drive AI-driven innovations in 6G technology, focusing on AI-native tools and frameworks.





The development of 6G technology is accelerating as the 5G era advances. NVIDIA is at the forefront, working with key industry players to foster innovation and collaboration in AI-driven 6G solutions, according to the NVIDIA Technical Blog.

AI Blueprints for Radio Access Network

The Radio Access Network (RAN) is a critical component of cellular networks, and AI/ML methodologies are being integrated to manage its increasing complexity. The International Telecommunication Union (ITU) has proposed an AI-native air interface for 6G, aimed at enhancing performance through AI/ML. NVIDIA has contributed significantly to 3GPP’s Release 18 study on AI/ML for the 5G New Radio (NR) air interface and is now focusing on Release 19, which will expand AI/ML integration.

Digital Twin Networks

Digital Twin Networks (DTNs) are essential for simulating and validating AI/ML models in 6G. These networks emulate physical 5G/6G networks, allowing developers to create and test AI/ML models in a controlled environment. NVIDIA’s Aerial Omniverse Digital Twin is a next-generation simulation platform designed to support AI-native air interface research and development.

Over-the-Air Innovation Sandbox

An Over-the-Air (OTA) development platform complements DTNs by providing a real-world environment to validate and benchmark AI/ML algorithms. NVIDIA’s Aerial RAN CoLab Over-The-Air (ARC-OTA) serves as a 3GPP Release 15 compliant, full-stack network sandbox, enabling developers to test and refine their innovations.

Collaboration with Industry Leaders

NVIDIA is collaborating with the AI-RAN Alliance, 3GPP, and O-RAN to drive AI/ML-enabled innovations that will define 6G. The AI-RAN Alliance focuses on creating implementation blueprints and benchmarking AI/ML algorithms for the new AI-native RAN. Meanwhile, the O-RAN Alliance is working on an AI-focused transformation towards an open and interoperable architecture.

Future Prospects

The pace of AI/ML adoption in 6G is expected to accelerate as standards become clearer and commercial deployment approaches. NVIDIA’s 6G Developer Program, which includes over a thousand researchers, is a key platform for ongoing and future collaborations. Researchers are encouraged to join this program to contribute to the advancement of 6G technology.

Image source: Shutterstock


