{"id":255977,"date":"2024-10-19T16:55:26","date_gmt":"2024-10-19T16:55:26","guid":{"rendered":"https:\/\/pdfstandards.shop\/product\/uncategorized\/bs-en-61400-25-52017-tc\/"},"modified":"2024-10-25T12:24:34","modified_gmt":"2024-10-25T12:24:34","slug":"bs-en-61400-25-52017-tc","status":"publish","type":"product","link":"https:\/\/pdfstandards.shop\/product\/publishers\/bsi\/bs-en-61400-25-52017-tc\/","title":{"rendered":"BS EN 61400-25-5:2017 – TC"},"content":{"rendered":"
IEC 61400-25-5:2017 specifies standard techniques for testing the compliance of implementations, as well as specific measurement techniques to be applied when declaring performance parameters. The use of these techniques will enhance the ability of users to purchase systems that integrate easily, operate correctly, and support the applications as intended.

This part of IEC 61400-25 defines:

- the methods and abstract test cases for compliance testing of server and client devices used in wind power plants;
- the metrics to be measured in said devices according to the communication requirements specified in IEC 61400-25 (all parts).

This new edition includes the following significant technical changes with respect to the previous edition:

- harmonization with the structure and test cases in IEC 61850-10:2012;
- reduction of overlap between standards and simplification by increased referencing to the IEC 61850 standard series.
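As a rough, hypothetical illustration of the kind of performance metric this part addresses (for example the communication latency / transfer time test introduced in 7.2 of the contents below), the following Python sketch records timestamped transfer-time samples and summarises them against an assumed limit. The names `TransferTimeSample`, `evaluate`, and `limit_s` are illustrative assumptions made for this example and are not defined by the standard.

```python
# Illustrative sketch only: IEC 61400-25-5 defines abstract test cases and
# performance metrics, not an API. All names below are assumptions.
from dataclasses import dataclass
from statistics import mean


@dataclass
class TransferTimeSample:
    test_case_id: str    # identifier of the test case being exercised (assumed format)
    sent_at: float       # time the value left the sending application, in seconds
    received_at: float   # time the value reached the receiving application, in seconds

    @property
    def transfer_time(self) -> float:
        # Elapsed time between sending and receiving the value
        return self.received_at - self.sent_at


def evaluate(samples: list[TransferTimeSample], limit_s: float) -> dict:
    """Summarise measured transfer times against an assumed limit."""
    times = [s.transfer_time for s in samples]
    return {
        "samples": len(times),
        "mean_s": mean(times),
        "max_s": max(times),
        "within_limit": max(times) <= limit_s,
    }


# Example usage with made-up figures:
samples = [
    TransferTimeSample("Srv-Rpt-01", 0.000, 0.012),
    TransferTimeSample("Srv-Rpt-01", 1.000, 1.015),
]
print(evaluate(samples, limit_s=0.1))
```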
| PDF Pages | PDF Title |
|-----------|-----------|
| 90 | |
| 95 | CONTENTS |
| 98 | FOREWORD |
| 100 | INTRODUCTION |
| 101 | 1 Scope; 2 Normative references; Figures; Figure 1 – Conceptual communication model of the IEC 61400-25 standard series |
| 102 | 3 Terms and definitions |
| 105 | 4 Abbreviated terms; 5 Introduction to compliance testing; 5.1 General |
| 106 | 5.2 Compliance test procedures |
| 107 | 5.3 Quality assurance and testing; 5.3.1 General; 5.3.2 Quality plan |
| 108 | 5.4 Testing; 5.4.1 General |
| 109 | 5.4.2 Device testing; Figure 2 – Conceptual compliance assessment process |
| 110 | 5.5 Documentation of compliance test report; 6 Device related compliance testing; 6.1 Test methodology |
| 111 | 6.2 Compliance test procedures; 6.2.1 General; 6.2.2 Test procedure requirements |
| 112 | 6.2.3 Test structure; 6.2.4 Test cases to test a server device; Figure 3 – Test procedure format |
| 113 | Figure 4 – Test system architecture to test a server device; Tables; Table 1 – Server documentation test cases |
| 114 | Table 2 – Server data model test cases |
| 115 | Table 3 – Association positive test cases; Table 4 – Association negative test cases |
| 116 | Table 5 – Server positive test cases |
| 117 | Table 6 – Server negative test cases |
| 118 | Table 7 – Data set positive test cases |
| 119 | Table 8 – Data set negative test cases; Table 9 – Substitution positive test cases |
| 120 | Table 10 – Unbuffered reporting positive test cases |
| 121 | Table 11 – Unbuffered reporting negative test cases |
| 122 | Table 12 – Buffered reporting positive test cases |
| 124 | Table 13 – Buffered reporting negative test cases |
| 125 | Table 14 – Log positive test cases; Table 15 – Log negative test cases |
| 126 | Table 16 – Control model test cases |
| 128 | Table 17 – DOns test cases; Table 18 – SBOns test cases |
| 129 | Table 19 – DOes test cases |
| 130 | Table 20 – SBOes test cases |
| 131 | 6.2.5 Test cases to test a client device; Table 21 – Time positive test cases; Table 22 – Time negative test cases |
| 132 | Figure 5 – Test system architecture to test a client device; Table 23 – Client documentation test case |
| 133 | Table 24 – Client data model test case; Table 25 – Association positive test cases |
| 134 | Table 26 – Association negative test cases; Table 27 – Server positive test cases |
| 135 | Table 28 – Server negative test cases |
| 136 | Table 29 – Data set positive test cases; Table 30 – Data set negative test cases |
| 137 | Table 31 – Substitution test cases; Table 32 – Unbuffered reporting positive test cases |
| 138 | Table 33 – Unbuffered reporting negative test cases |
| 139 | Table 34 – Buffered reporting positive test cases |
| 140 | Table 35 – Buffered reporting negative test cases |
| 141 | Table 36 – Log positive test cases; Table 37 – Log negative test cases |
| 142 | Table 38 – Control model positive test cases; Table 39 – Control model negative test cases |
| 143 | Table 40 – SBOes test cases; Table 41 – SBOns test cases |
| 144 | Table 42 – DOes test cases; Table 43 – DOns test cases |
| 145 | 6.2.6 Acceptance criteria; 7 Performance tests; 7.1 General; Table 44 – Time positive test cases; Table 45 – Time negative test cases |
| 146 | 7.2 Communication latency – Transfer time test introduction |
| 147 | 7.3 Time synchronisation and accuracy; 7.3.1 Time Sync test introduction; Figure 6 – Performance testing (black box principle) |
| 148 | 7.3.2 Time Sync test methodology; 7.3.3 Testing criteria; Figure 7 – Time synchronisation and accuracy test setup |
| 149 | Annex A (informative) Examples of test procedure template; A.1 Example 1; A.2 Example 2 |