The United States has formulated a ten-year semiconductor plan, with five major directions to shape the new future of chip technology

Publisher: chwwdch | Last updated: 2020-10-20 | Source: 半导体行业观察 (Semiconductor Industry Observer)

Recently, the U.S. Semiconductor Industry Association (SIA) and the Semiconductor Research Corporation (SRC) jointly published a report, the "Decadal Plan for Semiconductors" (the semiconductor ten-year plan). According to them, the plan was developed jointly by leaders from academia, government, and industry, and it identifies five directions that they believe will shape the future of chip technology. The report calls on the U.S. government to invest $3.4 billion in federal funding per year over the next decade to fund semiconductor research and development in these five areas.

 

The following is the article text:

 

The U.S. semiconductor industry leads the world in innovation, largely thanks to its heavy investment in R&D. U.S. semiconductor companies typically spend nearly 20% of their revenue on R&D each year, a share second only to the pharmaceutical industry. In addition, federal government funding for semiconductor R&D has acted as a catalyst for private R&D spending. Together, these private and federal investments have sustained the pace of American innovation and made the United States the global leader in the semiconductor industry. They have fostered innovative, commercially viable products, contributing significantly and directly to the U.S. economy and employment.

 

The current hardware-software (HW-SW) paradigm in information and communication technology (ICT) has become ubiquitous, thanks to continuous innovation in software and algorithms, system architecture, circuits, devices, materials, and semiconductor process technologies. However, ICT faces unprecedented technical challenges to sustaining its growth rate over the next decade. These challenges stem largely from fundamental limits of semiconductor technology, which are eroding the generation-over-generation improvements in the energy efficiency of information processing, communication, storage, sensing, and actuation.

 

Long-term sustainable ICT growth will rely on breakthroughs in semiconductor technology capabilities to enable holistic solutions to information processing efficiency issues. Breakthrough innovations are needed in areas such as software, systems, architecture, circuits, device structures, and related processes and materials, which require timely and well-coordinated multidisciplinary research efforts.

 

To maintain America's standing in semiconductors, SRC and SIA jointly launched this ten-year plan, which covers research priorities in information processing, sensing, communications, storage, and security to ensure the sustainable growth of the semiconductor and ICT industries. Information and communication technologies currently face five major changes, and the report explores these fundamental shifts and the industry opportunities they open up.

 

Here are five major changes:

 

Big change 1: Fundamental breakthroughs in analog hardware are needed to produce smart world-machine interfaces that can sense, perceive, and reason;

 

Big change 2: Growth in memory demand will outstrip global silicon supply, creating opportunities for entirely new memory and storage solutions;

 

Big change 3: Continuously available communications require new research directions to address the imbalance between communication capacity and data generation rate;

 

Big change 4: Breakthroughs in hardware research are needed to address security challenges arising from highly connected systems and artificial intelligence;

 

Big change 5: The growing energy demands of computing and global energy production are creating new risks, and new computing models provide opportunities to greatly improve energy efficiency;

 

According to previous SIA reports, maintaining and strengthening U.S. leadership in ICT in this new semiconductor era requires a sustained additional federal investment of $3.4 billion per year over the next decade (roughly tripling federal funding for semiconductor research) to conduct large-scale, industry-relevant basic semiconductor research. The Ten-Year Plan Executive Committee recommended how to allocate this additional $3.4 billion in annual investment across the five major shifts identified in the plan, basing the allocation on market-share trends and its analysis of the R&D needs of different semiconductor and ICT technologies.

The main objectives of the ten-year plan are to: 1. identify the important trends and applications driving the development of information and communication technology, along with the related barriers and challenges; 2. quantitatively assess the potential and status of the five major shifts that will affect future ICT technology; and 3. identify fundamental goals and metrics that would change the current trajectory of semiconductor technology.

 

Big change 1: A fundamental breakthrough is needed in analog hardware

 

According to our predictions, fundamental breakthroughs in analog hardware will be needed in the future to produce smart world-machine interfaces that can sense, perceive, and reason. Analog electronics process real-world signals that vary continuously and take many shapes (in contrast to digital electronics, whose signals are standardized and need only two levels, 1 and 0). The field of analog electronics spans multiple dimensions, as shown in Figure 1. Moreover, everything humans can perceive is analog, which calls for biomimetic world-machine interfaces built on ultra-compressed sensing and low operating power (Figure 2).

 

 

Figure 1: The dimensions of analog electronics

 

Figure 2: The brain's perception and reasoning abilities are built on ultra-compressed sensing, which reduces data by a factor of roughly 100,000 while operating at very low energy.

 

The physical world is analog in nature, and the "digital society" has a growing need for advanced analog electronics to mediate between the physical world and the world of computing. Sensing the environment around us is the foundation of the next generation of artificial intelligence, whose devices will have perception and reasoning capabilities. World-machine interfaces sit at the core of today's information-centric economy. For example, the next wave of the advanced-manufacturing revolution is expected to come from next-generation analog-driven industrial electronics, spanning sensing, robotics, industrial, automotive, and medical applications. For mission-critical applications, the reliability of electronic components is a priority: today, analog chips account for 80% of automotive electronics failures, roughly ten times worse than digital chips.

 

The total analog information generated by the physical world is estimated at ~10^34 bits/second. For reference, total human sensory throughput is ~10^17 bits/second (Figure 3). Our ability to perceive the physical world is therefore severely limited, and there are enormous opportunities for future analog electronics to augment human sensory systems, with significant economic and social effects; examples include multimedia designed around human sensory and cognitive systems, including neural-system interfaces and communication technologies.

 

This could lead to new human-centric technologies, such as multi-sensory medical diagnosis and treatment, full virtual reality with virtual aroma synthesizers, or active odor elimination based on indoor air quality.

 

 

Figure 3: Trends in installed sensing capacity worldwide

 

Today, our ability to generate analog data is growing faster than our ability to use it intelligently. This will only become more severe in the near future: data from our lives and from IoT sensors may create a torrent of analog data that obscures valuable information exactly when we need it most. Sensor technology is growing exponentially, with an estimated 45 trillion sensors by 2032 generating 1 million zettabytes (10^27 bytes) of data per year. That is equivalent to ~10^20 bits/second, exceeding the overall throughput of human perception.
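A quick back-of-the-envelope check of those figures (a Python sketch; the 2032 numbers are the report's projections, not ours):

```python
# 1 million zettabytes (10^27 bytes) per year, expressed as a sustained bit
# rate, compared with total human sensory throughput (~10^17 bit/s above).
bytes_per_year = 1e27
bits_per_year = 8 * bytes_per_year
seconds_per_year = 365.25 * 24 * 3600        # ~3.16e7 seconds

sensor_rate = bits_per_year / seconds_per_year
human_rate = 1e17

print(f"sensor data rate: {sensor_rate:.1e} bit/s")           # ~2.5e20 bit/s
print(f"vs. human sensing: ~{sensor_rate / human_rate:.0f}x")  # ~2500x
```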

 

Extracting key information from this predicted data deluge and acting on it appropriately is therefore the key to harnessing the data revolution. The analog grand goal is to obtain more useful, actionable information from fewer bits and less energy through revolutionary technologies, such as reducing sensed analog information at a practical compression/reduction ratio of 10^5:1.
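One hypothetical way to picture such sensory-data reduction is event-driven ("send-on-delta") sampling, which transmits a reading only when the signal has moved meaningfully. The sketch below is our illustration, not a technique the report prescribes, and its ratio falls far short of the 10^5:1 grand goal:

```python
import numpy as np

def send_on_delta(samples, delta):
    """Keep a sample only when it moves more than `delta` from the last kept value."""
    kept, last = [], None
    for i, s in enumerate(samples):
        if last is None or abs(s - last) > delta:
            kept.append((i, s))
            last = s
    return kept

# A slowly varying "sensor" signal with a little measurement noise.
rng = np.random.default_rng(1)
n = 1_000_000
signal = np.sin(2 * np.pi * np.arange(n) / 200_000) + 0.001 * rng.normal(size=n)

events = send_on_delta(signal, delta=0.01)
print(f"kept {len(events)} of {n} samples (~{n / len(events):.0f}:1 reduction)")
```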

 

For many real-time applications, the value of sensory data is short-lived, sometimes only a few milliseconds. The data must be used within that window and, in many cases, locally, for latency and security reasons. Pursuing breakthrough advances in information processing, such as layered perception algorithms that build an understanding of the environment from raw sensor data, is therefore an essential requirement, as are new computational models such as analog "approximate computing." This is consistent with ambitious goal #5, outlined later in this report, of discovering an entirely new "computational trajectory."

 

New analog technologies can also drive dramatic advances in communications. Even in computer-to-computer communications, analog interfaces are needed over long distances, and the ability to collect, process, and communicate analog data at the input/output (I/O) boundary is critical to the future of the Internet of Things and big data. Advances in analog technology in the terahertz realm will be required for future sensing and communications needs.

Call to action: Analog interfaces are what connect the physical and digital worlds, yet our collective ability to access information about the physical world through analog signals falls short of the available information by many orders of magnitude (compare the ~10^17 and ~10^34 bits/second figures above). Breakthrough advances in analog electronics will soon be needed. New approaches to sensing, such as sense-to-action systems, analog artificial intelligence (AI) platforms, brain-inspired/neuromorphic and hierarchical computing, or other solutions will be necessary.

 

Breakthrough advances in information processing, such as perception algorithms that build understanding of the environment from raw sensor data, are a fundamental requirement. So are new computing paradigms, such as analog "approximate computing," which trades output accuracy for energy and computation time (plausibly how the brain operates). New analog technologies will bring major advances in communication technology, and the ability to collect, process, and communicate analog data at the input/output boundary is critical to the future world of the Internet of Things and big data.
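A toy sketch of the accuracy-for-cost trade behind approximate computing (our illustration; operand bit width is only a crude stand-in for switching energy):

```python
import numpy as np

rng = np.random.default_rng(2)
a = rng.uniform(0.0, 1.0, size=10_000)
b = rng.uniform(0.0, 1.0, size=10_000)
exact = a @ b                          # full-precision reference result

for bits in (16, 8, 4):
    step = 2.0 ** -bits                # quantization step for values in [0, 1)
    aq = np.round(a / step) * step     # operands truncated to `bits` fractional bits
    bq = np.round(b / step) * step
    rel_err = abs(aq @ bq - exact) / exact
    print(f"{bits:2d}-bit operands: relative error {rel_err:.1e}")
```

Fewer bits mean fewer bit transitions per operation, bought at the price of a larger (but often tolerable) output error.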

 

Additionally, analog development methodologies need a step-change in productivity (10x or greater) to keep pace with the application explosion. In summary, collaborative research is needed to establish revolutionary paradigms for future energy-efficient analog integrated circuits across a broad range of future data types, workloads, and applications. Invest $600 million per year over this decade in new development trajectories for analog electronics. Selected priority research topics are outlined below:

 

 

Grand Goal #1: Analog compression/reduction of information at a practical ratio of 10^5:1, driving the use of information and data in a way much closer to the human brain.

 

Big change 2: Brand-new memory and storage solutions

 

We believe that in the future, growth in memory demand will outstrip the global silicon supply, creating opportunities for entirely new memory and storage solutions. Major innovation in devices, circuits, and architectures will be needed to deliver the new memory and storage technologies that future ICT requires. By the end of this decade, steady improvements in ICT energy consumption and performance will stall as the underlying memory and storage technologies hit scaling limits.

 

At the same time, the volume of training data for AI applications is exploding without bound. It is increasingly clear that coordinated innovation from materials and devices up through circuit- and system-level functions, most likely exploiting as-yet-unexplored physical principles, will be the key to new levels of bit density, energy efficiency, and performance in future information-processing applications.

 

The global demand for data storage is growing exponentially, which requires excessive physical resources to support the ongoing data explosion, and today's storage technologies will not be sustainable in the near future. Therefore, new fundamental solutions are needed for data/information storage technologies and methods. Figure 4 shows the forecast of global data storage needs - including conservative estimates and upper limits. As shown in Figure 4, future information and communication technologies will generate a large amount of data, far exceeding today's data flows.

 

Today, the production and use of information is growing exponentially, and by 2040 the amount of data stored globally is estimated to reach between 10^24 and 10^28 bits. Notably, even though a single bit in ultimately scaled NAND flash weighs about 1 picogram (10^-12 grams), the total mass of silicon wafers needed to store 10^26 bits would be on the order of 10^10 kilograms, exceeding the world's total available silicon supply (Figure 5).

 

 

Figure 4: Global demand for memory and storage is expected to outpace the amount of silicon available to be converted into silicon wafers worldwide.

 

Global demand for traditional silicon-based memory is growing exponentially (Figure 4), while silicon production is growing only linearly (Figure 5). This discrepancy makes silicon-based memory prohibitively expensive for Zetta-scale “big data” deployments within 20 years.

 

Grand Goal #2a: Develop emerging memories and memory fabrics with 10-100x improvements in density and energy efficiency at every level of the memory hierarchy. Grand Goal #2b: Discover storage technologies offering >100x greater storage density, together with new storage systems able to exploit them.

 

 

Figure 5: Global silicon wafer supply: data changes from 1990 to 2020 and future trend forecasts

 

In addition, memory such as DRAM is an essential component. Without "reinventing" the computing memory system, further progress in computing is impossible, where "reinventing" spans the device physics, the memory architecture, and the physical-layer implementation. For example, conventional embedded non-volatile memory cannot be scaled below 28 nanometers, so alternatives are needed.

 

Finally, new memory solutions must be able to support a variety of emerging applications such as artificial intelligence, large-scale heterogeneous high-performance and data center computing, and a variety of mobile applications that meet the harsh environment requirements of the automotive market.

 

Call to action: Fundamental breakthroughs in memory and data storage will soon be required. Collaborative research across the industry chain, from materials to devices, circuits, architectures, processing, and solutions, is necessary to deliver high-capacity, energy-efficient memory and data/information storage for a wide range of future applications.

 

Invest $750 million per year in new development tracks for memory and storage during this decade. Selected priority research themes are outlined below:

 

 

Big change 3: Communications requires new research directions

 

In our view, continuously available communications require new research directions, and the imbalance between communication capacity and the rate of data generation is another problem that must be addressed. Life in developed countries is now characterized by always-available communications and connectivity, which touches every aspect of life; cloud storage and computing is one manifestation. The ability to obtain data from anywhere and send it anywhere has changed the way we do business, along with personal habits and lifestyles; social networks are one example.

 

However, the main concept of the cloud is based on the assumption of constant connectivity. Furthermore, as we become more connected, the need to communicate becomes more universal. As shown in Figure 6, the gap between the world’s technological information storage needs and communication capabilities is growing, which is a worrying trend. For example, while it is currently possible to transfer the world’s stored data in less than a year, it is expected that by 2040 it will take at least 20 years to transfer.
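The arithmetic behind that projection is a single division; the aggregate capacity below is our illustrative assumption, not a number from the report:

```python
# Years needed to transfer all stored data at an assumed aggregate rate.
stored_bits = 1e24            # low end of the 2040 storage estimate (bits)
capacity_bps = 1e15           # assumed world aggregate transfer rate (illustrative)
seconds_per_year = 3.156e7

years = stored_bits / (capacity_bps * seconds_per_year)
print(f"~{years:.0f} years")  # ~32 years under these assumptions
```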

 

The crossover between global storage and communication capacity is expected around 2022, which could have a huge impact on ICT. Even though AI systems increasingly compute at the edge for privacy and response-time reasons, the explosion of information being generated and stored will demand enormous growth in cloud storage and communication infrastructure.

 

 

Figure 6: The intersection point indicates that the data generated exceeds the world's technological information storage and communication capabilities, creating data transmission limitations.

 

Grand Goal #3a: Advance communication technologies to enable moving all of the 100-1,000 zettabytes of data stored per year, at peak rates of 1 Tbps and an energy cost below 0.1 nJ/bit.
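To make the target concrete, the implied power ceiling of a single link follows directly from rate times energy per bit (a simple bound, not a figure from the report):

$$P = R \cdot E_{\text{bit}} = 10^{12}\,\text{bit/s} \times 10^{-10}\,\text{J/bit} = 100\,\text{W}$$

So a link running at the full 1 Tbps and the full 0.1 nJ/bit budget would dissipate at most 100 W; pushing energy per bit well below that ceiling is what keeps aggregate power manageable when moving hundreds of zettabytes per year.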

 

Grand Goal #3b: Develop intelligent and agile networks that use bandwidth efficiently to maximize network capacity.

 

Call to action: To meet growing demands, communications will need to evolve radically. For example, cloud technology is likely to change significantly, with the focus shifting toward edge computing and local data storage.

 

Broadband communications will extend beyond smartphones to augmented reality, virtual meetings and smart office settings. New capabilities will enrich the user experience through new use cases and new vertical markets. This requires collaborative research across a broad agenda aimed at establishing revolutionary paradigms to support widespread adoption of future high-capacity, energy-efficient communications.

 

The U.S. Department of Energy's Office of Science published a report in March 2020 identifying potential opportunities and scientific challenges for advanced wireless technologies. The challenges include extending wireless communication into the THz region, the interplay of wireless and wired technologies, new methods for network encryption and increasingly important security concerns, new millimeter-wave architectures, device technologies that can sustain the bandwidth and power requirements, and packaging and thermal control. Over this decade, $700 million will be invested annually in new communication technologies. Selected priority research topics are outlined below:

 

 

Big change 4: Hardware research needs a breakthrough

 

Based on our observations, breakthroughs in hardware research will be needed to address the security challenges arising in highly interconnected systems and artificial intelligence. Today's highly interconnected systems and applications require security and privacy (Figure 7). Corporate networks, social networks, and autonomous systems are all built on the assumption of reliable and secure communication, yet they face a variety of threats and attacks, from leakage of sensitive data to denial of service. The security and privacy field is changing rapidly as new use cases, new threats, and new platforms emerge. For example, the advent of quantum computing will bring new threat vectors, creating vulnerabilities in existing encryption methods.

 

Therefore, new encryption standards that are resistant to quantum attacks must be developed and their impact on system performance must be considered. In addition, privacy has become a major policy issue that is gaining increasing attention from consumers and policymakers around the world. Technical approaches to improving privacy include obfuscating or encrypting data when it is collected or released.

 

Meanwhile, devices have permeated every aspect of the physical world, so trust in those devices itself becomes a security issue, and security has never been more important. System safety and reliability must now account for malicious attacks in addition to the traditional problems of random failures and physical degradation, and cyber-physical systems must be designed to keep operating, or fail safely, even when under attack. We need intelligent algorithms that sift through contextual data to assess trust and perform secure sensor fusion over time.

 

This is a hard problem because of the sheer variety and volume of contextual data: future systems are, in effect, systems with unlimited communication and signaling possibilities. For example, cars can communicate with each other and with roadside infrastructure. As with humans, we need to give systems the intelligence to decide what to trust or distrust among everything they perceive.

 

 

Figure 7: A secure-system view

Our hardware is also changing.

 

Complexity is the enemy of security, and today’s hardware platforms are extremely complex due to performance and energy efficiency drivers. Modern SoC designs incorporate a range of special purpose accelerators and IP blocks. The security architecture of these systems is complex because these systems are now tiny distributed systems and we must build distributed security models with different trust assumptions for each component. In addition, these components are often sourced from third parties, which means the hardware supply chain needs to be trusted.

 

The pursuit of performance has also led to some subtle problems in microarchitecture. For example, many existing hardware platforms are vulnerable to speculative execution side-channel issues, which were exposed in Spectre and Meltdown. Driven by these and other issues, completely new hardware designs are needed in the future.

 

Today's workloads are increasingly dominated by artificial intelligence. Many security systems, for example, use anomaly detection to identify attacks, or contextual authentication based on behavioral analysis. AI's capabilities keep growing, and so do its applications in such trust-critical systems. Yet it is unclear how trustworthy the AI in these systems actually is. This is a problem not only for security systems, but for any system with implicit trust assumptions, for example, visual object detection in autonomous vehicles. Researchers have shown that small perturbations to an image can cause a neural network model to reach incorrect conclusions.

 

A small sticker placed on a stop sign can cause a model to classify it as a 45 mph speed-limit sign. Other deep-learning applications have similar trust problems: speech-recognition output can be manipulated by subtle changes to the audio, and malware can evade detection through small changes to a binary. The vulnerability of deep-learning models is tied to their unpredictability: neural networks are black boxes that offer no explanation for their decisions.
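A minimal sketch of why such tiny perturbations succeed, using a toy linear classifier in NumPy (hypothetical weights and data; the stop-sign attack targeted a deep network, but the gradient-direction idea is the same):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear "model": predict class 1 when w @ x + b > 0.
w = rng.normal(size=100)
b = 0.0
x = rng.normal(size=100)                 # a "clean" input

score = w @ x + b
print("clean score:", round(score, 2), "-> class", int(score > 0))

# For a linear model, the gradient of the score with respect to x is just w,
# so the most score-changing bounded step per feature is along -sign(w)
# (the same idea FGSM applies to deep networks). Choose eps just large
# enough to flip the decision.
eps = (abs(score) + 1.0) / np.abs(w).sum()
x_adv = x - np.sign(score) * eps * np.sign(w)

adv_score = w @ x_adv + b
print("adversarial score:", round(adv_score, 2), "-> class", int(adv_score > 0))
print("per-feature change:", round(eps, 3))   # tiny relative to feature scale ~1
```

Because the input has many dimensions, a per-feature change of a few percent accumulates into a large swing in the score, which is exactly what makes high-dimensional models brittle.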

 

Other important issues with neural networks are algorithmic bias and fairness. We need ways to make deep learning systems more trustworthy, explainable, and fair. Finally, the systems we must protect have become incredibly complex over the past decade. The cloud has become the standard for outsourcing compute and storage while maintaining control. We are still grappling with the security challenges that cloud computing presents—multi-tenancy, vendor assurances, and privacy—while cloud computing offerings continue to increase in complexity. The cloud now offers trusted execution environments as well as dedicated, shared hardware and software.

 

At the same time, there is growing interest in edge computing as we realize that the cloud lacks the performance and privacy guarantees of nearby computing infrastructure. The heterogeneous nature of the edge means that trust in edge computing service providers is a major issue, and of course, the security of IoT devices has been a concern for many years.

 

We must make secure development easier for resource-constrained, low-cost devices, and even careful security design can run into trouble in extreme environments. Compounding the problem, systems at every level keep growing more complex, as noted above for modern SoCs: small distributed systems whose third-party components each demand their own trust assumptions within a distributed security model.

 

Call to action: Today's systems are growing at an astonishing rate in both intelligence and ubiquity. At the same time, their increasing size and complexity force hardware specialization and optimization to meet performance challenges. All of these performance advances must go hand in hand with advances in security and privacy: protecting against weaknesses in machine learning and traditional cryptography, protecting the privacy of personal data, and addressing weaknesses in the supply chain and hardware.

 

Grand Goal 4: Develop security and privacy advances that keep pace with technology, new threats, and new use cases, such as trusted and secure autonomous and intelligent systems, secure future hardware platforms, and emerging post-quantum and distributed cryptographic algorithms.

 

Over the decade, $600 million will be invested annually in new developments in ICT security. The selected priority research themes are outlined below:

 

 

Big change 5: New computing model

 

In our view, the growing energy demands of computing are on a collision course with global energy production, creating new risks, while new computing models offer opportunities to greatly improve energy efficiency. Rapid advances in computing technology have delivered greater capabilities with each generation of products in nearly every market segment, including servers, PCs, communications, mobile, automotive, and entertainment. These advances, brought about by decades of R&D investment by private industry and government, have produced exponential growth in computing speed, energy efficiency, circuit density, and cost-effective production capability. Continuous innovation in software and algorithms, system architecture, circuits, devices, materials, and semiconductor process technologies has been the foundation of this growth.

 

While this trend has continued for decades, successfully overcoming many technological challenges, the recognition that conventional computing is approaching fundamental limits in energy efficiency now creates challenges that are more difficult to overcome. Explosive innovation in information representation, information processing, communication, and information storage is therefore urgent and critical to sustained economic growth and America’s technological leadership.

 

As the amount of computation grows each year, so does the number of bits needed to support it; by 2050 we are expected to be processing nearly 10^44 bits. As shown in Figure 8a, the total energy consumed by general-purpose computing keeps growing exponentially, doubling roughly every three years, while world energy production grows only linearly, at about 2% per year.
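Figure 8a's qualitative story can be reproduced with two compound-growth rates; the starting values below are hypothetical placeholders, and the crossover year depends entirely on them:

```python
# Exponential compute energy (doubling every 3 years) vs. near-linear world
# energy production (~2% per year): find the hypothetical crossover year.
compute_energy = 1e20       # J/year consumed by computing (assumed start)
world_energy = 6e20         # J/year produced worldwide (rough 2020 figure)

year = 2020
while compute_energy < world_energy:
    compute_energy *= 2 ** (1 / 3)   # doubles every three years
    world_energy *= 1.02             # grows about 2% per year
    year += 1

print("crossover around", year)      # ~2029 with these placeholder inputs
```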

 

This ever-increasing global computational energy is driven by ever-increasing demand for computation (Figure 8b), even though the chip-level energy per bit transition in computational processing units (e.g., CPU, GPU, FPGA) has fallen over the past 40 years, tracking Moore's Law, to about 10 aJ (10^-17 J) in current processors. Demand for computation is growing faster than Moore's Law can deliver, and Moore's Law itself is slowing as device scaling approaches fundamental physical limits. If the exponential growth in computational energy continues unchecked, market dynamics will cap the growth of computing power, flattening the energy curve (the "market dynamics limit" scenario in Figure 8a). Fundamental improvements in the energy efficiency of computing are therefore needed to avoid that scenario. The core challenge is bit utilization in computing, i.e., the number of bit transitions required to execute one computational instruction.

 

The current CPU computational trajectory follows a power law (Figure 9) with an exponent bounded by p ≈ 2/3. The theoretical basis for this trajectory and exponent value is not well understood, so the theory of computation needs further development. Notably, if the exponent could be increased by ~30%, computational efficiency and energy consumption would improve by a factor of one million, as illustrated by the "new trajectory" in Figure 9.
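As a rough sanity check on the factor-of-one-million claim, treat the trajectory as a pure power law (the symbol N and the 10^30 scale below are our inferences from the quoted numbers, not values taken from the report):

$$\text{performance} \propto N^{\,p}, \qquad p \approx \tfrac{2}{3}$$

Raising the exponent by 30%, so that $p \to 1.3\,p \approx 0.87$, multiplies performance by

$$N^{1.3p - p} = N^{0.3p} \approx N^{0.2}$$

which matches the quoted $10^{6}$ improvement precisely when $N \approx 10^{30}$, suggesting the trajectory in Figure 9 spans scales of roughly that order.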

 

 

Figure 8a: Total energy of computation: The solid yellow line shows continuing the current computation trajectory while improving the energy performance of the device. The dashed line shows the scenario where “limits from market dynamics” prevent further increases in world computational power, causing the energy curve to flatten. The blue box shows the scenario where a completely new computational trajectory is discovered.

 

 

Figure 8(b): The world's installed technological capacity to compute information, 2010 to 2050. The solid yellow line shows the current trend (after Hilbert and López). The dashed yellow line shows the "market dynamics limit" scenario, in which the world's computing power stops growing due to limited energy capacity. The blue box shows the scenario where a completely new computing trajectory is discovered.

 

 

Figure 9: The current CPU computational trajectory

Call to action: A revolutionary change in computing technology will soon be required. Computational loads continue to grow exponentially, as evidenced by the growth of AI applications and their training needs.

 

New computing approaches, such as in-memory computing, special-purpose compute engines, different AI platforms, brain-inspired/neuromorphic computing, quantum computing, or other solutions, will be necessary and will need to be combined heterogeneously. A recent National Science and Technology Council (NSTC) report describing the range of potential heterogeneous computing architectures asserts that an interdisciplinary, cross-functional approach will be needed to reach commercially viable, manufacturable solutions with the long-term staying power (at least a decade) to displace mainstream digital approaches.

 

This document aims to stimulate collaborative research "from materials to architectures and algorithms" to establish revolutionary paradigms that support a broad range of future data types, workloads, and applications for energy-efficient computing. For additional background, see the U.S. Department of Energy Office of Science, Microelectronics Basic Research Needs Workshop Report. Invest $750 million per year over this decade to change the trajectory of computing. Selected priority research topics are outlined below.

 

 

"Federal and private sector investments in semiconductor R&D have fueled the pace of innovation in the U.S. semiconductor industry, driving rapid growth throughout the U.S. and global economies," said SIA president and CEO John Neuffer.

 

"However, as we enter a new era, a renewed focus on public-private research partnerships is necessary to address the dramatic changes facing chip technology. The federal government must make ambitious investments in semiconductor research to keep the United States at the forefront of semiconductors and the game-changing future technologies they enable," said John Neuffe.

 

“The future holds limitless potential for semiconductor technology, with emerging applications such as artificial intelligence, quantum computing, and advanced wireless technologies expected to deliver immeasurable societal benefits,” said Dr. Todd Younkin, SRC president and CEO. “The 10-Year Plan provides a blueprint for how we can turn this potential into reality. By working together, we can advance semiconductor technology so it remains highly competitive and on top of the innovation wave.”

 
